Black College Students Are Leading the Movement to Eliminate Bias in Tech

During a conference exploring the civics of technology, members of the Ida B. Wells Just Data Lab shared how we can rethink technology to bring about social justice.

Extreme close-up photo of code on a screen. Pexels

By Nadira Jamerson, Word In Black 

From self-driving cars that struggle to detect folks with darker skin in time to avoid hitting them, to digital assistants like Siri that have trouble understanding non-white accents, technology is biased, and that bias is hurting Black folks.

“A lot of people will look toward technology as the end-all, be-all solution to a lot of social issues, but often social issues are not solved by technology, and technology often exacerbates these social issues,” says Cierra Robson, associate director of the Ida B. Wells Just Data Lab, which brings students, educators, and activists together to develop creative approaches to data conception, production, and circulation.

Cierra Robson, associate director of the Ida B. Wells Just Data Lab. Courtesy photo.

Founded in 2018 and led by Ruha Benjamin, a sociologist and professor in the Department of African American Studies at Princeton University, the lab focuses on finding ways to “rethink and retool the relationship between stories and statistics, power and technology, data and justice.”

“Civics of technology derives from a lot of related concepts, but it’s about how we can use technology to further civic engagement, the democratic process, and social justice — especially anything that will galvanize a group of individuals to create social good,” Robson explains.

In her role at the lab, Robson works closely with Princeton students on a variety of projects examining how technology bias contributes to bias in all areas of our lives, from healthcare to labor and education.

Robson first became passionate about finding solutions to biased technology after learning how it leads to violent over-policing.

“When I was an undergrad at Princeton, I had access to this entire wealth of resources that was kind of stuck in the university,” Robson says. “One of the biggest things that I wanted to do when the labs started in the summer of 2020 was figure out a way to get those resources from Princeton into the community, to people who needed them.”

And people do need this information, desperately, because biased technology is killing Black and brown folks and contributing to higher rates of incarceration and injustice.

“Predictive policing technologies — there’s a whole bunch of them — but one of the ones I focus on a lot is that it predicts where crime is likely to happen in a given city, and that prompts police to go be deployed in those areas so that they can catch whatever crime might happen there,” Robson says. “What they base that data on is an algorithm that uses data on historic police interaction, but no one really stops to think that those historic police interactions are colored by all sorts of discriminatory processes.”

Robson points to a recent study by Aaron Chalfin, a criminologist at the University of Pennsylvania, which found that in Southern cities with large Black populations, the homicide rate did not change when police presence increased. More officers did, however, mean more arrests for low-level offenses like alcohol-related infractions, “which are not typically seen as contributing to public safety.”

“The fact is that Black communities are historically over-policed even before the advent of these technologies and algorithms,” Robson explains. “When you feed the data that focuses on arrests only in Black communities into an algorithm that predicts where crime is likely to happen, what you are going to get out of it is that crime only happens in Black and brown neighborhoods when that’s not true.

“As a result,” Robson says, “police are deployed overwhelmingly to Black and brown neighborhoods, and it creates this cycle where more data is being created because there are more police there. This obviously has negative impacts on people’s lives. From the waves of violent policing that we’ve seen for quite some time, it’s evident why you would not want police in your community all the time. There has also been chronic over-policing and under-protection. Just because police are in a neighborhood, does not equate to greater safety in that neighborhood.”
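The feedback loop Robson describes can be sketched in a few lines of code. The toy simulation below is purely illustrative, with made-up neighborhoods and numbers, and is not modeled on any real predictive-policing system:

```python
import random

# Toy illustration of the feedback loop Robson describes: a system trained on
# historical arrest records sends more patrols to the neighborhoods with the
# most recorded arrests, which in turn generates more arrest records there.
# Neighborhood names and every number below are hypothetical.
random.seed(0)

arrest_history = {"A": 120, "B": 40, "C": 40}  # "A" starts out over-policed
TRUE_OFFENSE_RATE = 0.05                       # identical in every neighborhood
PATROLS_PER_ROUND = 20

for round_num in range(1, 6):
    total = sum(arrest_history.values())
    updated = {}
    for hood, past_arrests in arrest_history.items():
        # The "prediction" is simply each neighborhood's share of past arrests.
        patrols = round(PATROLS_PER_ROUND * past_arrests / total)
        # More patrols means more offenses observed, hence more arrests recorded,
        # even though the underlying offense rate is the same everywhere.
        observed = sum(random.random() < TRUE_OFFENSE_RATE for _ in range(patrols * 10))
        updated[hood] = past_arrests + observed
    arrest_history = updated
    print(f"Round {round_num}: {arrest_history}")
```

Because patrols are allocated in proportion to past records, the neighborhood that starts with the most records keeps receiving the most patrols and keeps generating the most new records, the cycle Robson describes, even though the true offense rate is identical everywhere in this toy model.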

Through her work with the lab and the Civics of Technology conference, Robson hopes to inspire more students to ask critical questions about how data is sourced and how technology is used in Black and brown communities, so they can apply that knowledge to create better practices in whatever fields of work and study they choose to venture into.

“A lot of them will end up in politics, in the tech industry, as lawyers, doctors, and all sorts of things,” Robson says. “One of the best things that comes out of teaching students of all kinds about this work is that it ripples out in every single environment in our daily lives, whether that be the law, whether that be healthcare, whether that be worker justice and labor.”

Participants in the Ida B. Wells Just Data Lab might go on to earn a doctorate or work in the tech industry, but that isn’t required. Robson says that even if they end up working in another industry entirely, she wants students to be “able to take some of the tools that we teach them about — about fair design practices and questioning what it really means to have something be objective, or questioning what it really means for something to be data-driven — into whatever area they’re going into in the future. Hopefully, if we can create enough students to do that, we are creating a new generation with a new awareness so that people are thinking twice about the technologies that they deploy and the data that they use in every area.”

To that end, in early August, the student members of the lab participated in Civics of Technology, a free, two-day virtual conference designed to bring the knowledge they’ve acquired while participating in the lab to the greater public.

Technology and Education Justice

During the virtual conference, Collin Riggins, a junior at Princeton and a research associate at the lab, and Payton Croskey, a senior at Princeton and creative content director for the lab, led “Reimagining Education Justice: Practices and Tools for Tech Freedom Schools,” a workshop focused on education justice, from early childhood to college and beyond the traditional classroom. Their goal was to explore how technology, if used properly, can promote better education practices for diverse students.

“The theme for this summer’s convention is Freedom Schools,” Croskey says. “Freedom Schools is where all of the research groups and products stem from. We are rooting ourselves in history before we try to build something new for the future.”

Freedom Schools were created in 1964 as entirely new schools specifically designed for the education and advancement of Black students. Supporters of these schools believed in paying attention to and meeting the unique needs of each individual child. Bringing that concept into the present day, Croskey and Riggins say that if we want to eliminate bias in education, we must similarly listen to and respond to the needs of the communities we wish to serve with technology.

We need to shift to “thinking how we can build technology and community with those that the technology is seeking to serve,” Croskey says. “If we are building technology for young Black students in New York City, and we are saying that this is going to help them learn, then they also need to be part of that conversation and need to be included in that design.”

“Technology is not going to be one size fits all,” Croskey says. “Especially in the education field, technology is going to need to be curated for a specific group and specific environments. Not pushing this one model that everyone needs to follow.”

“Although our goals are revolutionary, our work spawns from a long tradition of Black radical education,” Riggins adds. “We’re looking at the Freedom Schools, which decided in the summer of 1964 to create entirely different schools for Black students so they could learn frameworks for how to resist and how to function in daily life.”

“Across any institution, and even maybe across the world in general, there is a fixed approach to how you engage with technology,” Riggins says. “One of the things the lab does beautifully is allow people from different backgrounds and disciplines to come together in conversation. This has been very radical to me, especially at an institution like Princeton, which is very tech-driven and quantitatively driven. It’s nice to be able to engage with these concepts through art, or through storytelling, or through speculative fiction, and that not only be accepted but embraced. That inclusivity is rare.”

The two students have also used their time in the lab to focus on the use of surveillance in schools, which has increased significantly alongside the rise in school shootings. Although surveillance may help keep some students safe from shooters, Croskey worries it will prove dangerous for Black students and students from other marginalized backgrounds.

“There is a lot of surveillance being used these days with the rise of school shootings. There is a lot of data being collected and a lot of tracking done on students who do not have the power to consent,” Croskey says. In addition, there are “parents who are not being given the power to truly consent because they are not being given full explanations about how this data is being used or where it will be sent to.”

Their hope, Croskey explains, is that technology can be reimagined so that it is curated for specific groups and environments rather than pushed as a single model everyone must follow.

Connecting Technological and Environmental Justice

Kenia D. Hale, a fellow at Princeton’s Center for Information Technology Policy.

It’s been a boiling hot summer with historic droughts ravaging the globe, but many people don’t often think about the connections between technology and environmental justice.

“When you look into it, there are a lot of ways that the technologies that we are using can be harmful to the environment,” says Kenia D. Hale, a fellow at Princeton’s Center for Information Technology Policy. During the Civics of Technology conference, Hale, who is also a graduate of Yale University, led “Reimagining Environmental Justice: Practices and Tools for Tech Freedom Schools,” a session that explored the intersection of the two topics.

Hale says that although the energy needed for a single internet search or email is small, approximately 4.1 billion people, or 53.6% of the global population, now use the internet, and the greenhouse gases emitted with each online activity add up. The carbon footprint of our gadgets, the internet, and the systems supporting them accounts for about 3.7% of global greenhouse gas emissions, similar to the amount produced by the airline industry worldwide.
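To get a rough feel for that scaling, here is a back-of-the-envelope calculation. The 4.1 billion user figure comes from the statistics above; the per-email footprint and daily email volume are assumptions chosen only to illustrate the arithmetic:

```python
# Back-of-the-envelope scaling: tiny per-activity emissions multiplied by
# billions of users add up. The user count comes from the article; the
# per-email footprint and email volume are assumptions for illustration only.
INTERNET_USERS = 4.1e9          # ~53.6% of the global population
GRAMS_CO2E_PER_EMAIL = 4        # assumed footprint of one email, in grams CO2e
EMAILS_PER_USER_PER_DAY = 10    # assumed daily volume per user

daily_tonnes = INTERNET_USERS * GRAMS_CO2E_PER_EMAIL * EMAILS_PER_USER_PER_DAY / 1e6
yearly_tonnes = daily_tonnes * 365

print(f"Daily:  {daily_tonnes:,.0f} tonnes CO2e")   # ~164,000 tonnes per day
print(f"Yearly: {yearly_tonnes:,.0f} tonnes CO2e")  # ~60 million tonnes per year
```

Under these illustrative assumptions, email alone would account for tens of millions of tonnes of CO2e per year, before counting streaming, search, or the data centers behind them.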

“I wanted to figure out ways to challenge the idea that technology is automatically better for the environment, and spreading more awareness about the ways it can be quite harmful. People think there is no physical impact, but there is actually a lot of physical impact,” says Hale. “You can’t do anything without a laptop, so this isn’t to shame people into not buying one, but more so to spread awareness. Get engaged with the environmental organizations and activist groups that are in your city. It’s better to be more proactive in getting organized with our communities on how to collectively combat these things.”

Hale says some questions folks should ask themselves when weighing the environmental impact of a technological tool are: Who is mining the materials that go into your car, computer, or smartphone? And does the company that makes the product contribute disproportionately to global pollution?

Learning More About the Effects of Technology on Our Lives

To learn more about how to spot technology bias and advocate for better data-sourcing practices in your community, visit the lab’s research and resources page, which offers plenty of useful information.

In addition, the lab’s founder and director, Ruha Benjamin, has written extensively about the connections between technology and inequality. Her 2019 book “Race After Technology: Abolitionist Tools for the New Jim Code” explores how new technologies are framed as “benign and pure” even though they perpetuate social inequities. The book, a 2020 winner of the Oliver Cromwell Cox Book Award for anti-racist scholarship from the American Sociological Association Section on Race & Ethnic Minorities, also shares ideas on how we can combat these inequities.

“Black Power: The Politics of Liberation,” by Kwame Ture and political scientist Charles V. Hamilton, is another resource: it defines Black Power, offers insights into the roots of racism in the United States, and suggests a means of reforming the traditional political process for the future through technology and other tools.