Gender Gaps in the Tech Industry: Addressing Biased Language in Job Advertisements

Image by Mary Shin

Professor of Science and Technology Studies (STS) Janet Abbate writes in her essay in Your Computer Is on Fire, titled “Coding is Not Empowerment,” that according to the National Science Foundation, between “1993 and 2013, the percentage of women in computer science occupations actually declined from 31 percent to 24 percent.”

Considering this gender gap, many organizations and programs, including Girls Who Code, label the problem as a “pipeline” issue, wherein gender gaps are assumed to exist because women and minorities are not given the education needed to pursue jobs in tech. However, Dr. Abbate asserts that rather than focusing on the “pipeline” argument, “a more promising alternative is to identify and combat bias in the practices involved in recruiting, hiring, promotion, and funding. The problem begins with recruiting.”

As someone who is majoring in STS and will be entering the tech industry upon graduation, I find Dr. Abbate’s research both deeply compelling and problematic. I agree with her that recruiting has significant room for improvement. However, I believe the issue starts with how tech companies attract candidates in the first place: job advertisements.

Job advertisements are important because they provide a peek into the larger organizational culture of each company. Although a job ad is a small space and cannot resolve systemic social inequalities, it is nonetheless worth researching, evaluating, and challenging as a means of increasing diversity across the tech industry.

From a business and economic standpoint, using gendered wording in job ads limits the amount of talent that is attracted to the business. From a social equity and inclusion perspective, gendered wording negatively impacts diversity across many industries.

Using more gender-inclusive verbiage attracts a wider applicant pool, which leads to increased diversity in the hiring and recruitment process.

Throughout this article, I will explore a case study of Data Science job postings. Using textual analysis methods in Python, I examined 10,000 different job ads in the U.S., retrieved from Kaggle, to determine whether gender-biased wording exists throughout the data set. I will address and answer the following questions:

1. What kinds of gender-biased wording, if any, exist throughout this data set?

2. How does language used in job advertisements affect the applicant pool?

3. If gender-biased wording is the problem, then how do we resolve this issue?

Brief Literature Review

Before diving into my textual analysis, it is necessary to review the relevant literature on this topic to determine how job advertisements can impact the way in which potential candidates apply to different jobs, and why this issue is important to address.

Researchers found that “job advertisements for male-dominated areas employed greater masculine wording (i.e., words associated with male stereotypes, such as leader, competitive, dominant) than advertisements within female-dominated areas.”[1]

In addition, the study revealed that “women were described as more communal and less agentic than men, suggesting that language use can unintentionally reflect stereotypical gender roles.”

These results indicate that the language used in job advertisements, especially in male-dominated industries, perpetuates pre-existing stereotypes and gender gaps.

Another study found that in Computer Science, stereotypical cues could undermine “women’s sense of ambient belonging and subsequent interest in the environment but had no similar effect on men.”[2] Evidently, environmental cues and gender stereotypes can become deciding factors in women’s interest in this field.

There has been plenty of research on the impact of gendered wording on women. Although more education is constructive, it is not enough to increase diversity in the applicant pool. What the tech industry needs are concrete solutions and actionable steps to challenge its hiring process, starting with how job advertisements are written.

Case Study: Data Science Job Ads

Using Python, I ran a textual analysis of the data set of 10,000 Data Science job ads. My hypothesis was that these job descriptions would tend to include more masculine-coded words such as strong, leader, analytical, competitive, and aggressive.
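Before reporting the results, here is a minimal sketch of the kind of counting this analysis involves. The function name and the toy postings are my own illustration, not the exact code run against the Kaggle data:

```python
import re
from collections import Counter

def word_frequencies(postings):
    """Count total occurrences of each word across all postings,
    and the number of distinct postings each word appears in."""
    total = Counter()     # raw occurrence counts
    doc_freq = Counter()  # document frequency: postings containing the word
    for text in postings:
        words = re.findall(r"[a-z]+", text.lower())
        total.update(words)
        doc_freq.update(set(words))  # count each word once per posting
    return total, doc_freq

# Toy stand-in for the 10,000-row Kaggle data set
postings = [
    "Strong analytical skills required for complex data analysis.",
    "We seek a strong leader with excellent communication skills.",
    "Perform statistical analysis on complex data sets.",
]
total, doc_freq = word_frequencies(postings)
print(total["strong"], doc_freq["analysis"])
```

Applied to the full data set, counts like these yield the occurrence totals and per-posting shares reported in the findings.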

Through my data exploration, I found that some of the most commonly used words throughout the job ads include:

  • Analysis (20,117)
  • Strong (11,102)
  • Complex (8,906)

“Analysis” was found in 68% of all the postings, “strong” in 53%, and “complex” in 47%.
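Percentages like these come from document frequency: the share of postings that contain the word at least once. A simplified sketch, with hypothetical sample postings standing in for the real data:

```python
import re

def share_of_postings(postings, term):
    """Fraction of postings containing `term` as a whole word (case-insensitive)."""
    pattern = re.compile(rf"\b{re.escape(term)}\b", re.IGNORECASE)
    hits = sum(1 for text in postings if pattern.search(text))
    return hits / len(postings)

# Hypothetical sample postings
postings = [
    "Strong analytical skills and complex problem solving.",
    "Analysis of large data sets; strong SQL required.",
    "Team player with good communication.",
    "Lead complex analysis projects independently.",
]
print(f"{share_of_postings(postings, 'strong'):.0%}")
```

Note that whole-word matching deliberately excludes variants such as “strongly”; a stem-based match would catch those as well.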

Image by Mary Shin; Source: Kaggle

These words consistently appeared across the job descriptions in my data set. Although “analysis” might just be referring to the nature of the Data Science job, words like “strong” (which appeared in more than half of all the postings) and “complex” tend to come off as more masculine-coded and can affect the ways in which potential applicants view the job and company.

I also looked at key context surrounding several masculine-coded words in all of these job ads:


Context views for several masculine-coded words. Images by Mary Shin; Source: Kaggle

Although these are only a few examples, my findings show that many of these words are used in a context that seems more gender-biased than it needs to be. Such words could deter qualified people from applying, simply because the wording suggests the position is a better fit for someone who aligns with more masculine characteristics.
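The context examination above amounts to a keyword-in-context (concordance) view. Here is a minimal sketch of that technique, with an invented example posting; the function is my own, not the code behind the charts:

```python
import re

def keyword_in_context(postings, term, window=3):
    """Return (left, term, right) windows of up to `window` words
    around each occurrence of `term` in the postings."""
    results = []
    for text in postings:
        tokens = re.findall(r"[A-Za-z']+", text)
        for i, tok in enumerate(tokens):
            if tok.lower() == term.lower():
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                results.append((left, tok, right))
    return results

# Invented example posting
postings = ["Candidates must solve problems independently and thrive in a competitive environment."]
for left, term, right in keyword_in_context(postings, "independently"):
    print(f"... {left} [{term}] {right} ...")
```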

For instance, what does it tell us about a company when the job description includes that the applicant must be able to “solve problems independently”? What kind of candidate will this attract versus discourage? There are many implicit signals that this type of language can convey, and thus companies should be more vigilant about how they present their opportunities.


Based on my research and analysis, I propose that companies (especially in the tech industry) should re-evaluate their organizational culture and take the following actionable steps:

1. Replace “nice-to-haves” with requirements

Harvard Business Review reported that women tend to refrain from applying to jobs unless they meet 100% of the qualifications. According to Rachel Pacheco, a management author, lecturer, and researcher at The Wharton School, clarifying the role’s values and narrowing the qualifications to only the most necessary requirements will encourage a more diverse pool of applicants.[3]

Recruiters should seriously evaluate what counts as “most necessary” and clearly articulate the top 3–5 skills. This does not mean the standards must be lowered. Rather, it means that the role description will more accurately reflect the actual demands as well as the success metrics of the job.

2. Use gender-neutral wording

Use inclusive language by dropping third-person pronouns (refrain from he/she) and instead using plural or second-person pronouns to address the applicant. Job descriptions should not discourage people from applying for the position based on their social, racial, or gender identity.

3. Be more proactive than reactive

Companies should combat biased language proactively, before job ads are even written, rather than relying on reactive measures such as running finished job descriptions through bias-decoder sites, whose algorithms parse out masculine- versus feminine-coded words in any text. Such tools are useful, but they do not get us very far in combating bias before the job advertisement is posted.
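For reference, this is roughly what such decoder sites do under the hood. The stem lists below are a short, illustrative subset in the spirit of Gaucher et al. (2011); the published lists are far longer:

```python
import re

# Abbreviated, illustrative stem lists; Gaucher et al. (2011) publish much longer ones.
MASCULINE_STEMS = ["lead", "compet", "domin", "aggress", "strong", "independ"]
FEMININE_STEMS = ["collab", "support", "interperson", "communal", "nurtur"]

def decode(text):
    """Tally masculine- vs feminine-coded words in a job ad draft."""
    words = re.findall(r"[a-z]+", text.lower())
    masc = [w for w in words if any(w.startswith(s) for s in MASCULINE_STEMS)]
    fem = [w for w in words if any(w.startswith(s) for s in FEMININE_STEMS)]
    return masc, fem

ad = "We want a strong, competitive leader who works independently and supports the team."
masc, fem = decode(ad)
print("masculine-coded:", masc)
print("feminine-coded:", fem)
```

A proactive workflow would run a check like this while drafting the ad, not after it is posted.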

Key Takeaways

In conclusion, job advertisements reveal a company’s values and play an undeniably important role in shaping the diversity of the applicant pool. Shifting the culture of any organization, and of the industry at large, starts with the people who get recruited.

To work towards a more equitable recruitment process, the three actionable steps above can help companies identify, challenge, and remove biases, ultimately supporting increased diversity in the tech industry.


  1. Gaucher, D., Friesen, J., & Kay, A. C. (2011). “Evidence that gendered wording in job advertisements exists and sustains gender inequality.” Journal of Personality and Social Psychology, 101(1), 109–128.
  2. Cheryan, S., Plaut, V. C., Davies, P. G., & Steele, C. M. (2009). “Ambient belonging: how stereotypical cues impact gender participation in computer science.” Journal of Personality and Social Psychology, 97(6), 1045–1060.
  3. Pacheco, Rachel. “Personality: Selection and Hiring.” MGMT 238: Organizational Behavior. Class lecture at The Wharton School of The University of Pennsylvania, Philadelphia, PA, September 13, 2021.




Science, Technology & Society at the University of Pennsylvania

Mary Shin
