Sen. Akilah Weber Pierson, M.D. Credit: California Black Media

By Sen. Akilah Weber Pierson, M.D., Special to California Black Media Partners 

Technology is often sold to us as neutral, objective, and free of human flaws. We are told that computers remove emotion, bias, and error from decision-making. But for many Black families, lived experience tells a different story. When technology is trained on biased systems, it reflects those same biases and silently carries them forward.

We have already seen this happen across multiple industries. Facial recognition software has repeatedly misidentified Black faces at far higher rates than White faces, leading to wrongful police encounters and arrests. Automated hiring systems have filtered out applicants with traditionally Black names because past hiring data reflected discriminatory patterns. Financial algorithms have denied loans or offered worse terms to Black borrowers based on zip codes and historical inequities, rather than individual creditworthiness. These systems did not become biased on their own. They were trained on biased data.

Healthcare is not immune.

For decades, medicine promoted false assumptions about Black bodies. Black patients were told they had lower lung capacity based on race, and medical devices adjusted their results accordingly. That practice, despite lacking scientific justification, was not broadly reversed until 2021. One of the clearest examples that harmed millions of Black patients involved kidney care. Up until 2022, a common medical formula used to measure how well a person’s kidneys were working automatically gave Black patients a higher score simply because they were Black. On paper, this made their kidneys appear healthier than they truly were. As a result, kidney disease was sometimes detected later in Black patients, delaying critical treatment and referrals.

These biases were not limited to software or medical devices. Dangerous myths persisted that Black people feel less pain, contributing to undertreatment and delayed care. These beliefs were embedded in modern training and practice, not distant history. Those assumptions shaped the data that now feeds medical technology. When biased clinical practices become the foundation for algorithms, the risk is not hypothetical. The bias can be learned, automated, and scaled.

We are seeing this happen in real time. 

Across the country, some medical AI algorithms have underestimated how sick Black patients are because they used past healthcare spending as a proxy for medical need. Since Black patients historically received less care and fewer resources, lower spending made them appear healthier than they actually were. That meant fewer referrals for specialty care, fewer preventive interventions, and delayed treatment.

For us in the Black community, this creates understandable fear and mistrust. Many families already carry generational memories of medical discrimination, from higher maternal mortality to lower life expectancy to being dismissed or unheard in clinical settings. Adding AI biases could make our community even more apprehensive about the healthcare system.

As a physician, I know how much trust patients place in the healthcare system during their most vulnerable moments. As a Black woman, I understand how bias can shape experiences in ways that are often invisible to those who do not live them. As a mother of two Black children, I think constantly about the systems that will one day shape their health and well-being. As a legislator, I believe it is our responsibility to confront emerging risks before they become widespread harm.

That is why I am the author of Senate Bill (SB) 503. This bill aims to regulate the use of artificial intelligence in healthcare by requiring developers and users of AI systems to identify, mitigate, and monitor biased impacts in their outputs to reduce racial and other disparities in clinical decision-making and patient care. 

It is part of California’s effort to ensure that, as artificial intelligence becomes more deeply embedded in healthcare and society, innovation continues to move forward responsibly. Currently under consideration in the State Assembly, SB 503 was not written to slow innovation. In fact, I encourage technological advances in medicine. However, we must ensure that any new tool we introduce into the healthcare field helps patients rather than harms them.

But this conversation is bigger than any single bill.

It is about whether we allow human bias to migrate into code and quietly shape medical decisions for years to come, or whether we intentionally build systems grounded in fairness and accountability.

Every community deserves a healthcare system where decisions are based on need, not stereotypes or flawed data. Every patient deserves to be seen accurately, treated fairly, and cared for with dignity. 

If we want a healthcare system worthy of our communities, we must demand that it be free of bias and grounded in fairness.

The health of our families, our children, and our future depends on it.

About the Author 

Sen. Akilah Weber Pierson (D–San Diego) is a physician and public health advocate representing California’s 39th Senate District. A former Assemblymember, she focuses on advancing health equity, strengthening healthcare systems, and protecting vulnerable communities across the state.