
I’ve been working with data for decades. And over the years, I’ve found myself increasingly uncomfortable with the way we teach, collect, and communicate it. Inequity and bias in data science are everywhere.

 

Science That’s Not Quite Scientific

Data science gives us the ability to achieve change at an unprecedented scale and pace. But it's not without problems. The "capital-S Science" part of data science gives the impression that the numbers we work with are objective and free of bias. And data experts' use of confusing jargon in place of plain, easy-to-understand language doesn't help.

But the problem is fixable. We can work to uncover bias in data science with three simple strategies.

Demystify.

Democratize.

Demonstrate.

Just as in The Wizard of Oz, we must pull back the curtain so the world can see that data (and data scientists) are not infallible. We need to start a conversation about bias in data science that everyone can participate in.

 

A Matter of Life and Death

More and more, algorithms make decisions on our behalf. We rely on them to determine what gets purchased and what gets funded. Algorithms many of us don't even understand tell us which projects are working and which policies are effective.

Algorithms decide who lives and who dies.

Data is changing the world — literally. But thinking of it as a hard science gives us the very mistaken impression that statistics aren’t subjective. That data equals fact. That relying on an algorithm to make our decisions makes life objectively fairer.

But bias in data science permeates our systems. It's hidden in every step of the process: data collection, analysis, communication, and visualization are rife with inequality-causing assumptions, misunderstandings, blind spots, shortcuts, and outright errors.
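
To make that concrete, here's a minimal sketch in Python of how bias can slip in at the very first step, data collection, before any analysis happens. Every number in it is hypothetical: the groups, incomes, and response rates are invented for illustration, not drawn from real data.

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical population: half Group A (typical income $30k),
# half Group B (typical income $60k). All numbers are made up.
population = [("A", 30_000)] * 500 + [("B", 60_000)] * 500

def survey(pop, response_rates):
    """Keep each person with a probability that depends on their group,
    mimicking a survey instrument some groups answer more often than others."""
    return [income for group, income in pop
            if random.random() < response_rates[group]]

def mean(xs):
    return sum(xs) / len(xs)

# A "neutral" phone survey that Group B happens to answer twice as often.
biased_sample = survey(population, {"A": 0.2, "B": 0.4})

true_mean = mean([income for _, income in population])
print(f"True mean income:   ${true_mean:,.0f}")            # $45,000
print(f"Biased sample mean: ${mean(biased_sample):,.0f}")  # roughly $50,000
```

Nobody in this sketch set out to discriminate. A collection method that simply reaches one group more easily than another is enough to move the headline number, and every downstream analysis inherits the skew.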

How did this happen?

Data science finds its roots in the work of a bunch of classically trained, Western-oriented white guys. And like it or not, their perspective is inherently infused in the work of today's data scientists. (You know, the ones running the Whole. Damn. World.)

Data scientists, journalists, policy makers, visualizers, governments, NGOs, and citizens everywhere need to get a handle on this problem. We need to acquire the tools and understanding to catch and fix these issues, and to ditch the defunct assumptions of old-world data thinking.

 

A Project for Equity in Data Science

All my years of dissatisfaction with the often hidden bias in data science have culminated in a decisive step to address the problem. I'm proud to introduce We All Count, a project for equity in data science. We're starting that conversation — the one the world so sorely needs — on how to uncover bias in data science.

Join us as we share examples, build tools, and provide training and education aimed at helping everyone better understand data, so we can make it more equitable.

Who can be a part of We All Count?

Everyone. Thought leaders, computer scientists, high school students, technical workers, evaluation professionals, decision makers — everyone. Because when you do the math, we all count.

The Datassist blog will continue as it has, but if you want more stories that focus on equity, please sign up here for our weekly We All Count newsletter.

 
