New research suggests that technically oriented women could face gender discrimination in their jobs at high-tech firms in part because of mismanaged projects.

Tech firms rely excessively on a "hero mindset" to save runaway coding projects that are poorly organised, and employees with family responsibilities, often women, are sacrificed as a result, according to the report.

"The Recruitment, Retention, and Advancement of Technical Women: Breaking Barriers to Cultural Change in Corporations" was published by the Anita Borg Institute for Women and Technology, a nonprofit organisation that focuses on the role of women at high-tech firms.

There's also evidence of bias against women in recruitment and job assignment in places where high-tech corporate cultures thrive on this "hero mindset" that "rewards a 'last minute' crunch where 24/7 work becomes necessary to 'save' a project, failing to acknowledge family responsibilities and flexibility needs," the report says.

This fly-by-the-seat-of-your-pants work environment reflects a pattern that develops mainly when an organisation "poorly defines requirements and project management."

Silicon Valley's sometimes frantic firefighting pace and in-your-face communications style produce many technical cultures that "leave women feeling isolated and crushed," according to the report.

The report reflects what 59 senior business and technical managers, both men and women, confided in a closed forum organised last October by the Palo Alto-based Anita Borg Institute. The participants came from companies including Cisco, Facebook, Goldman Sachs, Google, HP, IBM, Intel, Microsoft and Symantec.

The report says it's common in the high-tech world to find the modern equivalent of the "good old boys network" that tends to hire "people who are like them."

Technical women these days are "still a rarity," says the report's author, Dr. Carolyn Simard. She notes that women in the United States earn just 18% of college computer science degrees, sharply down from the 37% recorded in 1985. Meanwhile, demand for technical talent is expected to grow by as much as 32% by 2018.

A second report published by the Anita Borg Institute, "Senior Technical Women: A Profile of Success," draws on a survey of about 1,800 participants from seven unidentified high-tech firms in Silicon Valley. It found that women now hold about 4% of the senior-level technical positions at high-tech firms, and an estimated one-quarter of all technology jobs. At senior levels, women end up in management jobs more often than men (36.9% of women compared with 19% of men), while men are much more likely to hold what's called an "individual contributor" position in technical coding jobs.

The second study also found that men and women in technical jobs value many of the same attributes for success, such as being analytical, questioning, risk-taking, collaborative, entrepreneurial, assertive and sociable, and working long hours.

Women far more often than men have "primary responsibility for the household," the study shows, though senior-level technical women are much more likely to have a partner who has primary responsibility for the household/children (23.5% of partnered senior women) in comparison with entry/midlevel women (13.4%). Senior-level technical women also are more likely than their male counterparts to forego a partner and children because they think it will drag down their careers.

To improve work-life balance in the high-tech world and counter perceived gender bias against women, the Anita Borg Institute is putting forward a few ideas certain to generate debate.

One of the recommendations suggests that because there's evidence women are eliminated in the hiring process at the resume-review stage, companies might adopt a policy "that all women candidates should at least get an interview."

A "software tool for detecting bias," which was proposed at the Institute's forum last October, could use language recognition to zero in on everything from performance evaluations to letters of recommendation that exhibit gender bias.

The report says one online tool like this can be found at Harvard's Project Implicit, and says "we envision building on such research to create a system where specific language can be fed and analyzed for the existence of bias."

"Using machine learning and text analyses methods would help organisations and individuals address the existence of bias before the damaging language is formally used in recommendations or evaluations," and would be a "high-impact diagnostic tool for calibrating organisations" with regard to hiring and promotion decisions, the report says.
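To make the idea concrete, here is a minimal sketch of what a crude first step toward such language analysis might look like. The word lists and function name are illustrative assumptions, not the Institute's actual tool; real bias-detection systems would rely on trained machine-learning models rather than fixed vocabularies:

```python
import re
from collections import Counter

# Illustrative word lists (an assumption for this sketch, not the
# Institute's tool): studies of recommendation letters suggest women
# are more often described with "communal" words and men with
# "agentic" ones.
COMMUNAL = {"helpful", "kind", "warm", "nurturing", "supportive", "dependable"}
AGENTIC = {"confident", "ambitious", "independent", "decisive", "assertive", "driven"}

def flag_gendered_language(text):
    """Count communal vs. agentic descriptors in an evaluation text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    return {
        "communal": sum(counts[w] for w in COMMUNAL),
        "agentic": sum(counts[w] for w in AGENTIC),
    }

# A lopsided ratio between the two counts could flag a performance
# review or recommendation letter for human follow-up.
print(flag_gendered_language(
    "She is warm, helpful and dependable; he is confident and decisive."
))
```

A production system along the lines the report envisions would go well beyond word counting, using trained classifiers over full sentences, but the flag-and-review workflow would be similar.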