Rethinking Bias & AI

How do biases in AI reflect biases in society?
by Alison Gazarek

How do biases in AI reflect biases in society? How can we understand them to avoid repeating the mistakes of our past? Our current conditions can’t be understood without historical context; history is a series of choices, and, like AI, what we put in informs what we put out. How do we ensure we make better choices in the first place, rather than mitigating the ones we’ve already made? 

Brief History of School Integration in Seattle

In 1954, the Supreme Court decided Brown v. Board of Education, ending de jure school segregation across the country. It took decades, however, for cities to act on desegregation, and Seattle was no exception. Redlining and racist housing covenants led to unequal resources and segregated schools across our city.

For years, local Black activists, churches, organizations, and students demanded integration through boycotts, protests, and the creation of Freedom Schools. In 1971, Seattle Public Schools became the first major school district to opt into desegregation without a court order, busing north end kids south and south end kids north.

Citywide integration rested on an audacious idea: for education to be free and equal, resources had to be shared across the district, and school quality could not depend on the wealth and resources of a specific neighborhood. It acted as a forcing mechanism for the district to invest in schools equally. Of course, this wasn't perfect: the burden often rested on families of color to assimilate into white-dominant schools, and white families left the district in droves.

Today, Seattle Public Schools is almost as segregated as it was 40 years ago. In 2007, a group of parents took their case all the way to the Supreme Court, arguing that the race-based tiebreaker used in school assignments to diversify schools violated their rights and was itself a form of discrimination. The Court ruled in their favor, and desegregation in Seattle schools effectively ended.

AI and Bias

What does this have to do with AI and innovation? Well, like history, AI is only as good as its inputs. It is a technology created to emulate human reasoning and thinking patterns, but the data we use to build the large language models (LLMs) we rely on predetermines their outputs. The largest LLMs are trained on widely available internet content, books, and other text-based sources, material that is overwhelmingly male, anglocentric, and otherwise biased.
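To make that concrete, here is a minimal, illustrative sketch of the underlying dynamic: a toy next-word counter, not how production LLMs are actually built, trained on an invented corpus whose skew then shows up directly in the model's default output.

```python
# Toy illustration: whatever skew exists in the training text becomes
# the model's output. Deliberately simplified; not a real LLM.
from collections import Counter, defaultdict

# Hypothetical, invented corpus with a 3-to-1 skew in how "said" continues.
corpus = (
    "the engineer said he would help. "
    "the engineer said he was busy. "
    "the engineer said he agreed. "
    "the engineer said she would help."
).split()

# Count which word follows each word in the corpus.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word: str) -> str:
    """Predict the most frequent continuation seen in training."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("said"))  # -> "he": the skew in the data becomes the default output
```

Nothing in the counting code is "biased"; the prediction simply reflects the imbalance in what it was given to learn from.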

We asked DALL-E to create an image of someone writing an article about AI and bias at a consulting firm in Seattle. As you can see, although the leader of our iF education team is a woman, the model assumes a white man. But it's not the AI itself that is biased; the bias comes from the humans who created its inputs in the first place.

Most of the conversation around equity and AI has focused on 'mitigating bias.' While that's a start, mitigation is about cleaning up a mess, not preventing it in the first place. Creating inequitable AI models and AI technology and then asking how to make them less inequitable is absurd. We have to create inputs for AI and AI-enabled products that are as inherently nuanced, diverse, and complicated as we are.

Inputs determine the outputs, and better inputs can create a more representative, free, and equal experience in the first place. One example of a product like this is Latimer, an LLM striving to better represent the experiences of Black and brown people, which is being trained on data like oral histories, Indigenous folk tales, and local archives: starting with inputs that create more representative outputs.

We find ourselves at an urgent inflection point where, as in history, we can decide to intentionally create the future we want rather than clean up the choices of our past. We can choose to center students in the development of technologies that will affect them, and to train future models to reflect the rich diversity of human experience in ways we have failed to in the past.

The question isn’t how we mitigate bias in AI; it’s how we create responsible technology in the first place.

Interested in hearing more about our work in Education? Please write to Alison Gazarek, Director of Education, at alisong@intentionalfutures.com.

Sources

“Why Seattle schools are more segregated today than the 1980s” by Dahlia Bazzaz, The Seattle Times

“The Seattle School Boycott of 1966” by Brooke Clark, The Seattle Civil Rights and Labor History Project

“Seattle schools chose integration. But then it fell apart” by Dahlia Bazzaz, The Seattle Times
