About

What is Data Bias?

The infrastructure for this network didn't exist. So we're building it.

The Idea

Data Bias began as a research project during an MA in Digital Culture and Society at King's College London. The original question was simple: where do you go to find Black women and gender-non-conforming researchers working on AI, algorithms, and data bias outside the United States?

The answer was: nowhere. No single place. You had to already know the names, already be in the rooms, already have the network. Which meant that the researchers doing the most important work were the hardest to find — and the hardest to fund.

Data Bias exists to fix that.

What We Do

We maintain a free, publicly searchable database of women of colour working in AI safety, AI governance, and algorithmic justice.

Profiles include research focus, affiliations, publications, and (where researchers have opted in) information about funding needs and collaboration interests.

Our Framework

Data Bias uses the Black Women Best Framework (Janelle Jones, Roosevelt Institute, 2020) as its core operating principle. The BWBF holds that when we design policies, platforms, and systems for those at the sharpest intersections of oppression — Black women, in this case — we build infrastructure that works for everyone. By centring the most marginalised, we create more robust, more equitable, more resilient systems.

This is not a platform that treats diversity as a metric. It treats the knowledge, research, and expertise of women of colour in AI as the primary resource the field needs right now.

The Gap We Fill

Organisations like Women in AI Governance, Black in AI, and the Data Feminism Network do important work. But none specifically centres women of colour globally, and none provides a searchable public archive of individual researchers' work, affiliations, and funding needs. Data Bias fills that gap — particularly for researchers in the Global South, in the UK, and in communities not centred in mainstream AI discourse.

Get Involved

If you are a woman of colour working in AI safety, governance, or algorithmic justice — anywhere in the world — we want to include you. Email us or use the submission form on the catalogue page.

If you represent a funding organisation, research institute, or conference and want to partner with Data Bias, please get in touch.