Like most of us…

...Maya* now lives in a world filled with decisions made by Artificial Intelligence.
*Maya is a fictional character. But the information below is factual and informed by expert interviews.

From news suggestions…

To calculating insurance premiums…

To controlling the power grid…

To suggesting music… even creating it.

Alvi’s right - there are loads of stories about how AI bias is affecting real people.

AI uses statistics and probability with big datasets to analyse, learn, and make predictions.

Afua Bruce, an author and founder of a company that helps to develop responsible technology, says those datasets are often just us.

It’s our data that we’re all giving away.

It’s scraped off the internet, and AI then generalises based on what’s in there.

But that data is packed full of the historical human biases that we already see in our lives.

“Racism, sexism, ableism, all the isms, they weren’t created with the internet, they weren’t created with AI … they already exist,” says Ms Bruce.

“One of the things AI does is sort of reflect back and extrapolate what we have already told it we are,” she says.

So, if the datasets already record higher crime rates for one group, the AI learns that pattern.

It then makes decisions about people from that group, often perpetuating cycles of inequality.
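
To sketch the mechanism in code: the snippet below is a minimal, hypothetical illustration (the groups, numbers, and labels are invented for this example, not taken from any real dataset or from the systems described here). A model that simply learns the historical rate of “high risk” labels for each group will score people differently by group alone, reflecting the bias baked into the records it was trained on.

```python
from collections import Counter, defaultdict

# Hypothetical historical records: (group, labelled_high_risk).
# Group "B" has been policed more heavily, so it appears with the
# positive label far more often in the data.
records = [("A", 0)] * 90 + [("A", 1)] * 10 + [("B", 0)] * 60 + [("B", 1)] * 40

# "Training": count how often each group received each label.
counts = defaultdict(Counter)
for group, label in records:
    counts[group][label] += 1

def predicted_risk(group):
    """Return the positive-label rate the model learned for this group."""
    c = counts[group]
    return c[1] / (c[0] + c[1])

# Two otherwise identical people get different scores purely by group,
# because the model extrapolates the historical pattern it was handed.
print(predicted_risk("A"))  # 0.1
print(predicted_risk("B"))  # 0.4
```

Any real system is far more complex than this toy, but the loop is the same: biased inputs produce biased scores, which can in turn generate more biased data.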

But this bias isn’t always easy to see.

Two-thirds of IT professionals surveyed in 2023 say their companies are struggling with data bias in AI.

But only 13% globally say they’re currently addressing these problems.

Reid Blackman, CEO of AI ethical risk consultancy Virtue, says very few organisations are even aware that bias is deeply embedded in their software or datasets.

“Most companies aren’t doing anything about the bias issue,” says Blackman.

We already know AI is helping to inform…

Job and school applications…

AI programs built on biased data are everywhere, and many are worried they are reinforcing inequality and injustice in our societies.

Credits

  • Reporter & creator: Carl Smith
  • Illustrator & visual storytelling: Ing Lee
  • Web developer: Stefan Auerbach
  • Scientific advisor: Professor Didar Zowghi, Science Team Leader – Diversity and Inclusion in Artificial Intelligence, CSIRO Australia
  • With thanks to Ítalo Carajá.

This project was supported and first published by the MIP.labor program.