Very Little Is Keeping Doctors From Using Racist Health Formulas

Recently, two leading medical associations recommended ending a decades-old practice among doctors: using race as one of the variables to estimate how well a person’s kidneys filter waste out of their bodies. Until now, clinicians would derive an estimate from the level of creatinine, a waste product, in a patient’s blood, then multiply that estimate by a factor of approximately 1.15 if the patient was Black. Using race to estimate kidney function contributes to delays in dialysis, kidney transplants, and other life-saving care for people of color, especially Black patients.
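
To make the arithmetic concrete, here is a purely illustrative sketch in Python of how that kind of race multiplier sits on top of an estimate. It is not the actual clinical equation; the function name, the rounded 1.15 coefficient (taken from the description above), and the example values are assumptions for illustration only.

```python
# Purely illustrative sketch -- NOT the real clinical equation.
# It only shows how a race multiplier is layered on top of whatever
# estimate the underlying formula produces from a patient's lab values.

RACE_COEFFICIENT = 1.15  # the approximate factor described above


def race_adjusted_estimate(base_estimate: float, recorded_as_black: bool) -> float:
    """Apply the kind of race 'correction' the older equations used.

    base_estimate: a kidney-function estimate already derived from
    serum creatinine, age, and sex by the underlying equation.
    """
    return base_estimate * RACE_COEFFICIENT if recorded_as_black else base_estimate


# The same lab result yields a roughly 15 percent higher estimate for a
# patient recorded as Black, making their kidneys look healthier on paper
# and potentially delaying referral for dialysis or a transplant.
print(race_adjusted_estimate(55.0, recorded_as_black=False))  # 55.0
print(race_adjusted_estimate(55.0, recorded_as_black=True))   # roughly 63
```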

To reach that decision, 14 experts spent roughly a year evaluating dozens of alternatives, interviewing patients, and weighing the impact of keeping race in the equation. Their final recommendation replaces the old formula with a corrected kidney equation that is equally accurate for everyone, regardless of race.

Yet other risk equations that include race are still in use, including ones that have been used to deny payouts to former NFL players in a concussion settlement, ones that may contribute to underdiagnosing breast cancer in Black women, and ones that have miscalculated the lung function of Black and Asian patients. Ending the use of race-based multipliers in these and dozens of other calculators will take more than a task force in one medical specialty. It will take researchers who not only believe but act on the knowledge that race is not biology, and a biomedical research enterprise that sets clearer standards for how these calculators are used. Otherwise, it’s only a matter of time before another tool that wrongly uses race to make decisions about patients’ bodies trickles into clinical care.

Physicians have relied on risk calculators, which help doctors make quick decisions in the face of uncertainty, for over four decades. Many doctors tend to stick with the versions they first heard about while in medical school or completing their residency, says California-based ER physician Graham Walker. That kidney function equation that was just updated? Many clinicians still use a much older version that doesn’t include the correction. That ancient version, first developed in 1973, is still the most popular equation on MDCalc, a website and smartphone app that Walker and his cofounder, Joseph Habboushe, developed to curate risk calculators and make them easily accessible to clinicians. While they don’t track users closely, usage statistics and a 2018 survey suggest that about 68 percent of doctors in the US use MDCalc at least once a week.

And given that scientists were using race to distinguish between people long before modern medicine existed, it’s not surprising that race became part of many equations when risk calculators were developed.

In the kidney function equation and many others, race became a stand-in for differences in the measurements of some biomarker or other that researchers saw among their study participants, who were usually either white or Black. The observed differences are biological. But they are the result of health disparities caused by racism, not a result of race itself. They might also be mere statistical blips, because a study didn’t include sufficient numbers of Black participants.

And while kidney function equations in the US included a multiplier for being Black, similar calculators in other parts of the world were developed to include “Chinese” or “Japanese” coefficients. In the US, non-Black people of color have found their doctors averaging the Black and non-Black values to estimate their kidney function, or simply defaulting to the “normal”—usually the estimates for white individuals.

Scientists developing these types of calculators often rely on long-running databases from the CDC that include a column of demographic details next to biological statistics such as weight or disease stage. Because that demographic information correlates with differences in disease incidence, severity, or death rates, multipliers for race or ethnicity have become a convenient proxy for the unknown, underlying reasons for those differences. The collective burden of this practice is tough to estimate, because, outside of numbers such as those from MDCalc, it’s impossible to know how many times a risk calculator is used, or how each doctor uses the results to guide an individual patient’s care. Even so, it’s clear that risk equations being developed today still include race as a factor.

Yet there is another way. In November 2020, researchers developed a new risk calculator, the VACO index, to predict the odds of dying within a month of a positive Covid-19 test. They used data from the Veterans Affairs health care system, which closely tracks not just a person’s race but also preexisting illnesses that might affect the course of a Covid infection. Once the developers included variables representing an individual’s age, gender, and chronic conditions such as hypertension, race didn’t matter: the race-free equation worked equally well for everyone in the study.

One explanation for why race does not improve the equation’s accuracy, the researchers suggest in a podcast, is that patients in the VA system experience fewer barriers to accessing care. Disparities in health outcomes are often the result of systemic hurdles and unequal access to health care; with fewer barriers, the seemingly race-based difference in the risk of death shrank. Another possibility is that the detailed medical histories the developers had at hand captured the underlying biology of the disease itself, so there was no need to lean on race as a proxy. “Both theories [about the VACO score] argue that Covid may seem worse in underserved populations because we don’t properly know about chronic conditions in these populations or other social determinants of health,” Habboushe says. “It’s not specific to a checkbox of race itself.”
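
The kind of check the VACO developers describe can be illustrated with a toy experiment: fit the same sort of risk model with and without a race variable and compare predictive accuracy. The sketch below is not the VACO index or its methodology and uses no VA data; it relies on synthetic numbers and assumes numpy and scikit-learn are available, with every variable name and coefficient chosen purely for illustration.

```python
# Toy sketch with synthetic data -- NOT the VACO index or its data.
# It illustrates the comparison described above: does adding a race term
# improve a model that already accounts for age, sex, and chronic conditions?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 20_000

age = rng.normal(65, 12, n)
male = rng.integers(0, 2, n)
comorbidities = rng.poisson(2, n)      # count of chronic conditions
race_group = rng.integers(0, 2, n)     # recorded race category (illustrative)

# Synthetic outcome driven only by age, sex, and comorbidities,
# mirroring the finding that race added no independent signal.
logit = -8 + 0.07 * age + 0.3 * male + 0.5 * comorbidities
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_base = np.column_stack([age, male, comorbidities])
X_race = np.column_stack([age, male, comorbidities, race_group])

for name, X in [("without race", X_base), ("with race", X_race)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, death, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")  # the two scores come out essentially equal
```

In this toy setup, the model with the race column performs no better than the one without it, which is the pattern the VACO developers report once age, sex, and chronic conditions are accounted for.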
