
We need to talk about gender bias in tech

[Illustration: women pushing a female symbol up a hill on a browser screen]

Whilst the noise around gender bias in the technology industry is getting louder, as it is across a range of sectors, men still lead the research teams whose work informs the debate. This means the research underpinning fundamental fields such as government policy, experimental medicine, emerging technology and urban planning still carries inherent gender biases.

In turn, these unevenly represented research teams, whether consciously or unconsciously, skew the data they produce. That data shapes the reality of women, 50% of the world’s population. A must-read, Invisible Women: Exposing Data Bias in a World Designed for Men by Caroline Criado-Perez, dives into real-life examples where such data seriously hampers, and in extreme cases endangers, the lives of women.

 

Here are two data discrepancies which will likely shock you:

  • Smartphones are designed for men’s hands, which means manufacturers’ models are too big to fit in women’s palms, pockets or wallets. The only mainstream phone made in a small size was Apple’s iPhone SE, which was discontinued after just two years.
  • Risk-prediction models for cardiac disease still rely on majority-male patient research. The most commonly used models are built on studies whose participants were two-thirds men and just one-third women, despite the sexes’ differing anatomy. Heart disease is widely assumed to affect older men above all; an overweight man is the first stock image you’ll find on Google for the disease. That’s despite numerous researchers indicating that women from lower socioeconomic backgrounds are 25% more likely to suffer a heart attack than the stereotypical older man.

 

Carrying on the tradition with COVID-19 research

The George Institute for Global Health, a medical research unit at the University of Oxford, found significant gender bias in the authorship of research relating to COVID-19. In other words, women’s views are not shaping the response to the pandemic to the same extent as men’s.

With limited influence over how the crisis is framed in the news, and little say in the policy decisions governments make in response to the pandemic, women are at greater risk of being further marginalized during one of the most significant global health crises of our lifetime – and likely of many to come.

 

How technology data bias preserves gender inequality

Data is increasingly being used to shape the world, as well as to record its activities to date. Not only does biased data endanger the lives of women, it also preserves a less life-threatening – but just as wrong – unequal status quo. Women are still fighting to be paid the same as men, and are still forced to choose between having children and progressing in their careers at the same pace as men. The invisible, casual bias which pervades our day-to-day lives dictates the way men (and women) perceive a woman’s place in society. Here are some examples of this in the tech industry:

 

Siri, Alexa and Cortana

An alarming number of virtual assistants are named after women. As Emily Nicolle points out in her investigation into the devices for Financial News: “Microsoft’s Cortana was named after an artificial intelligence (AI) from the Halo video game franchise depicted as a ‘sensuous unclothed woman’, while Siri translates from Norse to mean ‘beautiful woman who leads you to victory’.”

In her findings, Nicolle cites studies by UNESCO, the United Nations agency, which find that naming virtual assistants after women perpetuates the notion that women are subservient, both to men and to the world at large.

What happens when you give a 5-year-old boy the option to demand things, with no ‘please’ or ‘thank you’, from a submissive woman’s voice? What does it teach him about the world and what it “owes” him?

It’s not surprising that women’s names are overwhelmingly used to market this technology when you consider – as Nicolle also points out – that in 2020 just 22% of the AI workforce was made up of women, according to the International Telecommunication Union.

 

AI programming

The word embedding models used to power conversational bots and word searches provide another instructive example of gender bias in tech. In an article for The Guardian, Lizzie O’Shea writes: 

“Researchers have experimented with one of these word-embedding models, Word2vec, a popular and freely available model trained on three million words from Google News. They found that it produces highly gendered analogies. 

“For instance, when asked “Man is to woman as a computer programmer is to ?”, the model will answer “homemaker”. Or for “father is to mother as doctor is to ?”, the answer is “nurse”. 

“This bias, reflecting social discrimination, will now be reproduced and reinforced when we engage with computers using natural language that relies on Word2vec. It is not hard to imagine how this model could also be racially biased, or biased against other groups.”
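
If you want to see this behaviour for yourself, here is a minimal sketch of how such an analogy query can be run with the open-source gensim library. It assumes you have downloaded the pre-trained Google News Word2vec vectors locally (the file name below is illustrative), and the exact words returned may vary with the model version.

```python
# A minimal sketch: probing a pre-trained Word2vec model for gendered analogies.
# The file name is illustrative; results may vary with the model version.
from gensim.models import KeyedVectors

# Load the pre-trained Google News vectors (a multi-gigabyte download).
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin.gz", binary=True
)

# "Man is to woman as computer_programmer is to ?"
# Vector arithmetic: computer_programmer - man + woman
print(vectors.most_similar(
    positive=["woman", "computer_programmer"], negative=["man"], topn=3
))

# "Father is to mother as doctor is to ?"
print(vectors.most_similar(
    positive=["mother", "doctor"], negative=["father"], topn=3
))
```

The “computer programmer → homemaker” and “doctor → nurse” answers quoted above come from exactly this kind of vector arithmetic, which is why researchers treat embedding models as a mirror of the text they were trained on.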

 

Criado-Perez’s examples

Speech recognition software is largely trained on recordings of male voices. As a result, Google’s version is 70% more likely to understand a man than it is a woman. One female user reported that her car’s voice-command system only listened to her husband, even when he was sitting in the passenger seat.
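
If you wanted to check a speech-recognition system for this kind of bias yourself, one simple approach is to measure its word error rate separately for recordings from male and female speakers. Here is a minimal sketch using the open-source jiwer library; the transcripts below are hypothetical placeholders standing in for the output of a real recogniser over a labelled test set.

```python
# A minimal sketch: comparing word error rate (WER) across speaker groups.
# The transcripts are hypothetical placeholders, not real recogniser output.
from jiwer import wer

# (speaker group, reference transcript, recognised transcript)
results = [
    ("female", "navigate to the nearest pharmacy", "navigate to the near as fast as he"),
    ("female", "call my mother on her mobile", "call my other on her mobile"),
    ("male", "navigate to the nearest pharmacy", "navigate to the nearest pharmacy"),
    ("male", "call my mother on her mobile", "call my mother on her mobile"),
]

for group in ("female", "male"):
    refs = [ref for g, ref, _ in results if g == group]
    hyps = [hyp for g, _, hyp in results if g == group]
    print(f"{group} speakers, WER: {wer(refs, hyps):.2f}")
```

A consistently higher error rate for one group is exactly the kind of imbalance Criado-Perez describes, and it points straight back at the training data.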

VR headsets are designed in a way which means women are more likely to feel sick while wearing one. According to a VentureBeat investigation, the default interpupillary distance (IPD) – that is, the distance between the eyes – that a VR headset is built around is larger than the average IPD of the population. A mismatched IPD is like wearing glasses which aren’t your prescription, which is why a headset set too wide can cause motion sickness. Men have larger IPDs than women on average, so a VR headset’s default design suits men better than women.
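
To make the mismatch concrete, here is a tiny sketch comparing an average user’s IPD with a headset’s default lens spacing. Every number in it is an illustrative assumption rather than a specification of any real headset; published anthropometric surveys generally put the average female IPD a few millimetres below the average male IPD.

```python
# A minimal sketch: how far is a user's IPD from a headset's default lens spacing?
# All numbers are illustrative assumptions, not real product specifications.

HEADSET_DEFAULT_IPD_MM = 63.5   # hypothetical default lens spacing
AVERAGE_FEMALE_IPD_MM = 61.7    # illustrative survey average
AVERAGE_MALE_IPD_MM = 64.0      # illustrative survey average

def ipd_mismatch_mm(user_ipd_mm: float,
                    headset_ipd_mm: float = HEADSET_DEFAULT_IPD_MM) -> float:
    """Absolute gap between the user's eyes and the headset's lens spacing."""
    return abs(user_ipd_mm - headset_ipd_mm)

for label, ipd in [("average woman", AVERAGE_FEMALE_IPD_MM),
                   ("average man", AVERAGE_MALE_IPD_MM)]:
    print(f"{label}: {ipd_mismatch_mm(ipd):.1f} mm away from the default")
```

Under these illustrative numbers the average woman sits further from the default than the average man does, which is the ‘wrong prescription’ effect described above.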

Fitness monitors underestimate steps during housework by up to 74%. Users also complain that they don’t count steps taken while pushing a pram.

 

What can we do?

Changing data is not an easy win. We, as an industry, need to make a concerted effort to rethink the way we design research and collect data. Here are some tips, inspired by Forbes contributor Shaheena Janjuha-Jivraj:

  1. Researchers and designers should always question themselves and their research. Do you have a diverse mix of people analysing and questioning the findings? A quick audit of who is represented in your data, like the sketch after this list, is a good place to start.
  2. Build inclusive and diverse teams, so that every voice is heard when a new piece of technology is designed.
  3. Be aware. Raise your voice. Understand inequalities and actively look for them.
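
To make the first tip concrete, here is a minimal sketch of the kind of representation audit a team could run on its own dataset before drawing conclusions from it. The attribute name, the 60% threshold and the sample data are illustrative assumptions.

```python
# A minimal sketch: flag when one group dominates a research dataset.
# The attribute name and the 60% threshold are illustrative assumptions.
from collections import Counter

def audit_representation(records, attribute="gender", threshold=0.60):
    """Print each group's share of the dataset and warn if any share exceeds the threshold."""
    counts = Counter(record[attribute] for record in records)
    total = sum(counts.values())
    for group, count in counts.most_common():
        share = count / total
        flag = "  <-- over-represented" if share > threshold else ""
        print(f"{attribute}={group}: {share:.0%}{flag}")

# Hypothetical study participants, echoing the two-thirds/one-third split
# in the cardiac research described above.
participants = [{"gender": "male"}] * 660 + [{"gender": "female"}] * 340
audit_representation(participants)
```

It won’t fix a skewed dataset on its own, but running a check like this early makes the imbalance visible before it is baked into a product or a policy.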

 

To learn more

Read Invisible Women: Exposing Data Bias in a World Designed for Men by Caroline Criado-Perez

As well as Do It Like a Woman: ... and Change the World, also by Caroline Criado-Perez

Listen to a podcast called 99% Invisible

Follow Caroline Criado-Perez on Instagram

And sign up to her newsletter
