
The New York Times recently ran a piece called Facial Recognition Is Accurate, if You’re a White Guy. We do a lot of work with groups aiming to reduce gender bias and bias against marginalized groups. So of course, the article caught my eye. But not for the reasons you might think.

I was, frankly, a bit boggled this was even a story.

The piece in question explores recent research that shows artificial intelligence cannot accurately identify a person’s gender using facial recognition technology. I’m fascinated by new developments in social tech. I’m even more interested in the potential consequences of putting our lives in the hands of new algorithms and AI. But why is it news that a computer can’t identify a person’s gender just by looking at them? Do we think that any human can automatically identify another human’s gender with a simple glance at their face?

Gender Bias in Big Data

So big data isn’t going to solve our gender bias problems?

Not really, no. As we're fond of saying around here, piling up more biased data doesn't produce less bias. I've written before about big data problems we need to be careful to avoid, and hidden bias is right up there near the top. (And it's not just about gender: Lena Groeger is quick to point out multiple examples of gender, racial, and economic bias in data visualizations.)

There is a fundamental flaw with asking computers to sort humans into groups. (Any groups, but for now, let’s stick with gender.) The problem is that our own gender bias gets baked right into the technology. And we often never go back and re-examine how the algorithm was developed in the first place.

To use a really simplified example, let's travel back in time to the early 1900s. Non-binary gender wasn't a recognized concept. Men and boys in the West typically wore their hair short, while women and girls generally wore theirs longer. Now let's pretend we had facial recognition software back then. How would you teach a computer to distinguish between males and females? Hair length would probably be a major factor.

Now say we employed this technology during elections. (Remember, women still aren't allowed to vote in most places.) An unusual gentleman with slightly longer tresses than average steps up and is denied his right to vote, all because someone programming the system had a gender bias that carried over into the technology.
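To make the thought experiment concrete, here's a minimal sketch of what that biased rule might look like in code. Every name and number here is invented for illustration; the point is that the bias lives in the rule the programmer chose, not in anything the voter did:

```python
# A hypothetical 1900s "gender classifier" that hard-codes the
# programmer's assumption (long hair => female) as a rule.
# All names and thresholds are invented for illustration.

def classify_gender(hair_length_cm: float) -> str:
    """Naive rule a biased programmer might have written."""
    # The bias lives in this one line: both the feature (hair length)
    # and the cutoff reflect the era's grooming norms, not gender.
    return "female" if hair_length_cm > 15 else "male"

# Our unusual gentleman with slightly longer tresses than average:
voter_hair_length_cm = 20

print(classify_gender(voter_hair_length_cm))
# -> "female" ... and he is turned away from the polls, even though
# hair length determines nothing about his gender.
```

Nobody re-examining this system later would see "gender bias" written anywhere; they'd just see a tidy rule that seems to work most of the time.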

Going Beneath the Surface

Obviously, the example above is basic. Equally obviously, now that we do have facial recognition technology and algorithms that attempt to sort us and predict our behaviours, we also have a lot more criteria to work with when determining gender.

For starters, it's now widely accepted that gender is not binary. So merely sorting people into male vs. female just isn't going to cut it. There are also several distinct concepts that often get lumped together as "gender": biological sex, gender identity, and gender expression. This article details some of the risks and challenges associated with collecting data on gender:

“Even when individuals do not volunteer this information to an application, it may be possible to algorithmically infer gender or sexual orientation from knowledge of their social networks. This can pose risks of repercussion, either in the form of personal shame for people who have hidden their gender identity or even discrimination, violence and imprisonment depending on the context and community where they live.”
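To see how that kind of inference could work, here's a minimal sketch of one common approach: a homophily-based majority vote over the disclosed attributes of a user's contacts. The network, names, and labels are all made up for illustration, and real systems are far more elaborate, but the core idea is this simple:

```python
# Sketch of inferring an undisclosed attribute from a social network.
# The profiles and friendships below are entirely made up.

from collections import Counter

# Disclosed gender labels (None = the user never volunteered one)
profiles = {
    "alex":  None,
    "blair": "female",
    "casey": "female",
    "drew":  "female",
    "eli":   "male",
}

# Who is connected to whom
friends = {
    "alex": ["blair", "casey", "drew", "eli"],
}

def infer_gender(user: str) -> str:
    """Majority vote over the disclosed labels of a user's contacts."""
    labels = [profiles[f] for f in friends[user] if profiles[f]]
    return Counter(labels).most_common(1)[0][0]

# Alex never shared a gender, but the algorithm "decides" anyway:
print(infer_gender("alex"))  # -> "female"
# The guess may simply be wrong. And even when it happens to be
# "right", it exposes something the person chose not to disclose.
```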

What Can You Do?

Recognizing gender bias (or any kind of bias, really) is a good start. Remember the post about lawmakers who couldn’t accurately estimate what percentage of their coworkers or constituents were female? Understanding that all data comes with bias attached (the bias of the people who collected, analysed, and visualized it, as well as your own as you interpret it) is important too.

If you need help dealing with gender bias in your data, or if you want to learn more about this issue, we can help. Get in touch with the team at Datassist today.
