Using Data Science to Highlight Societal Inequities
Hertz Fellow Emma Pierson wields machine learning like a Swiss Army knife to investigate a range of problems, including disparities in COVID-19 testing, the treatment of osteoarthritis, and police discrimination.
“The problems I work on, like inequality in health, are genuinely emotionally compelling to me,” Pierson said.
Currently a senior researcher at Microsoft Research New England, Pierson will take a position this summer as an assistant professor of computer science at Cornell Tech, Cornell University’s new campus in New York City.
Pierson’s ultimate goal is to improve human lives, but she acknowledges a huge gap between academic publication and that objective.
“When you do occasionally hit that high bar, that’s nice. It can be as simple as someone saying, ‘Your paper meant something to me because it resonated with my personal experience with women’s health,’ but it can also be something more sweeping like policymakers changing the way they do something in the world.”
Racial disparities and police stops
Policymakers took notice last year when Pierson and her colleagues published “A large-scale analysis of racial disparities and police stops across the United States.”
The team collected data (and released it to other data scientists) on nearly 100 million traffic stops conducted across the country and determined that Black drivers were less likely to be stopped after sunset, when a “veil of darkness” masks one’s race, suggesting bias in stop decisions.
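To make the veil-of-darkness logic concrete, here is a minimal sketch in Python. It uses synthetic data and hypothetical column names (clock_time, is_dark, is_black) rather than the team's actual pipeline: restrict attention to the intertwilight window, then test whether stopped drivers are less likely to be Black when it is dark, holding clock time fixed.

```python
# Toy veil-of-darkness test: synthetic data, hypothetical columns.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20_000

# Stops within the "intertwilight" window, where the same clock time
# is dark on some dates and light on others, so darkness varies while
# time of day is held fixed.
clock_time = rng.integers(0, 8, n)       # eight 15-minute bins
is_dark = rng.integers(0, 2, n)          # 0/1 indicator: was it dark?
# Simulated bias: Black drivers are stopped less often in darkness.
is_black = rng.binomial(1, 0.30 - 0.05 * is_dark)

stops = pd.DataFrame(
    {"clock_time": clock_time, "is_dark": is_dark, "is_black": is_black}
)

# Logistic regression: probability the stopped driver is Black as a
# function of darkness, controlling for clock time.
fit = smf.logit("is_black ~ is_dark + C(clock_time)", data=stops).fit(disp=0)
print(fit.params["is_dark"])  # negative => fewer Black drivers stopped
                              # in darkness, consistent with bias
```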
Pierson's team also helped journalists from the LA Times apply the same methods to show that the Los Angeles Police Department was searching Black and Hispanic drivers on the basis of less evidence than white drivers.
In response to the LA Times investigation, the police department changed its policies.
COVID-19 testing and mobility
Pierson became interested in whether the methods she and her colleagues used to study police searches could be applied to health.
Late last year, she published a paper showing that the same math she used to study inequality in police searches could be applied to inequality in COVID-19 testing. The results suggested that Black patients were tested only at a higher threshold of evidence, that is, only when they were much more likely to have COVID-19.
“In the case of a police search, the purpose is to find contraband, and in the case of COVID-19 testing, the purpose is to find people infected with the virus. The same math applies because the settings are parallel,” Pierson said. “You’re testing for whether some racial groups face different thresholds for having a police search or for getting tested for COVID. In both cases, we’re testing for an inequality in thresholds, which is a hallmark of discrimination.”
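The shared logic can be illustrated with the simple “outcome test,” an intuition-level precursor to the Bayesian threshold test used in the published analyses. The numbers below are hypothetical:

```python
# Outcome-test intuition with hypothetical numbers; the published work
# fits a fuller Bayesian threshold test rather than comparing raw rates.
import pandas as pd

tests = pd.DataFrame({
    "group":      ["white", "Black"],
    "n_tested":   [10_000, 4_000],   # hypothetical tests administered
    "n_positive": [800, 640],        # hypothetical positive results
})
tests["positivity"] = tests["n_positive"] / tests["n_tested"]
print(tests)  # positivity: 0.08 vs 0.16

# A much higher positivity rate in one group suggests its members were
# tested only at a higher threshold of evidence. The same comparison
# works for the fraction of police searches that turn up contraband.
```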
In a related study, she and her colleagues analyzed mobility data from 98 million cell phones in the United States from March to May 2020. (The team is now updating the analysis with more recent data.) The anonymized, aggregated data described how many people visit a given store at a given hour and approximately which neighborhoods they come from. Layering a model of COVID-19 spread on top of this data correctly predicted, from mobility patterns alone, that poorer and less white neighborhoods would see higher rates of COVID-19 infection.
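The modeling approach can be sketched as a toy metapopulation SEIR model driven by a mobility matrix. This is a deliberately simplified stand-in, with made-up parameters, for the study's much larger bipartite neighborhood-to-venue network:

```python
# Toy mobility-driven SEIR over three neighborhoods; all parameters
# are hypothetical stand-ins for the study's fitted model.
import numpy as np

pop = np.array([10_000.0, 10_000.0, 10_000.0])
# visits[i, j]: visits from neighborhood i to venue j per time step
visits = np.array([[50.0, 5.0], [30.0, 20.0], [10.0, 60.0]])
beta, sigma, gamma = 0.002, 1 / 4, 1 / 7  # transmission, incubation, recovery

S, E, I, R = pop - 10, np.zeros(3), np.full(3, 10.0), np.zeros(3)
for _ in range(100):  # daily steps
    venue_inf = visits.T @ (I / pop)  # infectious density at each venue
    lam = beta * visits @ venue_inf   # per-capita infection risk by neighborhood
    new_E = S * lam
    S, E, I, R = (S - new_E,
                  E + new_E - sigma * E,
                  I + sigma * E - gamma * I,
                  R + gamma * I)

print("attack rate by neighborhood:", np.round(R / pop, 3))
# Neighborhoods whose residents keep visiting crowded venues end up
# with higher attack rates, mirroring the paper's qualitative finding.
```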
“People in poorer and less white neighborhoods weren’t able to reduce their mobility as much after the pandemic started,” Pierson said. “This makes total sense in light of what we know about essential workers and who’s able to easily work from home and how that’s correlated with socioeconomic status and race. The other reason is that when they do go out, they go to places that are more crowded and therefore more dangerous.”
One implication of these results is that policymakers can’t just consider the impact of reopening policies on the population as a whole; they have to think about how those policies will impact these disadvantaged groups, she said. “It’s not necessarily as a result of our work, but the Biden administration clearly has this top of mind, which is good,” she added.
Inequalities in osteoarthritis treatment for knees
Unequal thresholds also undergird Pierson’s 2021 paper on the treatment of Black and white patients with knee osteoarthritis, a common cause of disabling pain in older adults. Assessing the severity of knee damage helps doctors prescribe the right treatment, including physical therapy, medication, or surgery.
“Black patients in our data reported higher pain levels,” Pierson said. “This gap persisted even when we controlled for how severe the doctor thought the disease was, as measured by an X-ray of the patient’s knee.”
Traditionally, a radiologist reviews an X-ray of the knee and scores the severity of knee disease using the Kellgren–Lawrence grade (KLG), which assesses the presence of radiographic features such as the degree of missing cartilage or structural damage. But the KLG scale was developed decades ago using data from a white British population.
“It’s plausible that the scale doesn’t capture all the factors relevant to pain in modern and more diverse populations who may live and work very differently and have a very different set of occupational and environmental risk factors,” Pierson said.
Theorizing that the pain gap exists because of knee damage clinicians are not trained to look for, Pierson and her colleagues trained an algorithm on patients’ own pain reports, asking it to find features in the knee X-ray that predict pain. The algorithm identified pain-relevant features that the doctor’s severity score missed and that disproportionately affected Black patients. Because severity scores influence treatment decisions, algorithms could potentially reduce disparities in access to knee surgery by identifying knee damage missed by traditional methods.
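The core move can be sketched in a few lines: train a model to predict the patient’s reported pain directly from the radiograph, rather than to reproduce the radiologist’s grade. The toy below uses random placeholder tensors and a tiny network standing in for the deep convolutional model used in the actual study:

```python
# Toy pain-prediction model: random placeholder data, tiny network.
import torch
import torch.nn as nn

model = nn.Sequential(                  # tiny stand-in for a deep CNN
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(16, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

xrays = torch.randn(32, 1, 128, 128)    # placeholder knee radiographs
pain = torch.rand(32, 1) * 100          # placeholder 0-100 pain scores

for _ in range(5):                      # training-loop sketch
    opt.zero_grad()
    loss = loss_fn(model(xrays), pain)
    loss.backward()
    opt.step()

# Comparing this model's predictions with the Kellgren-Lawrence grade
# indicates how much pain-relevant signal the standard score misses.
```

Because the label is the patient’s own report rather than the clinician’s grade, the model can pick up damage the grading rubric was never designed to capture.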
When building machine learning algorithms, she noted, it’s important not just to replicate the doctor’s clinical judgment, which may be incomplete or biased, but to learn from the patient as well.
Impact of the Hertz Fellowship
Pierson became interested in computer science in college. “I got bad medical news that I carried a genetic mutation that ran in my family and conferred a high risk of cancer. And so that was very concretely motivating, although I don’t necessarily recommend the experience.”
Instead of pursuing physics as she’d planned, Pierson switched gears and did a master’s in statistics at Oxford University on a Rhodes Scholarship. She returned to the United States for her doctorate in computer science.
“The Hertz Fellowship gave me the freedom to innovate. As my interests shifted, I worked with a whole range of people,” she said. She joined the group of Jure Leskovec, associate professor of computer science at Stanford University, who gave her the freedom to study inequality in healthcare. She and her colleagues wrote three papers on women’s health and the menstrual cycle while she was in graduate school.
“This was not a standard computer science topic. This is something that we need to talk about as a fundamental aspect of the health of half the global population,” Pierson said. “Women’s health is historically understudied and women, as a population, have historically sometimes been explicitly excluded from medical studies because they’re too ‘volatile.’” One of her next research interests is intimate partner violence.
Pierson acknowledges that there are limits to data science tools and that “these predictive superpowers don’t automatically make the world a better place. Often the most vulnerable groups don’t have data collected on them. People are becoming much more aware of how these algorithms can also exacerbate inequalities, and that’s as it should be.”