
Unintended consequences are old news; so why are we surprised every time they happen? The only remedy is monitoring for harms and fixing problems when we find them. It sounds simple, but it rarely happens.
Computer programs developed to target care management services to people with serious, complex conditions were meant to make important health decisions based on data. Instead, they favored white patients over sicker black patients.
Algorithms are formulas used by computer programs to analyze big data and help health systems make better decisions. Because algorithms are based on evidence and data rather than on referrals and best guesses from humans, the system should be fairer and more efficient, targeting precious resources to exactly where they are needed. They are an important tool to help providers improve care and, in theory, should reduce racial inequity.
Often the algorithms use patients' previous high healthcare spending as a proxy to predict who could benefit from care management services. It makes sense. Care management is a very successful but expensive tool, widely used to improve outcomes, control costs, and improve satisfaction for high-risk patients. In the usual course of health policymaking, things would continue that way indefinitely, on the assumption that all is well.
Thankfully, a group of researchers followed up to see what really happened under one widely adopted algorithm that is applied to about 200 million Americans each year. They found that instead of reducing racial and ethnic disparities, the new systems were making things worse.
Black patients with complex health needs in the study got less care than whites with the same level of health (an average of $1,801 less). Black patients also had more severe health conditions than whites but received the same algorithmic score.
It’s likely that blacks’ lower access to care, discrimination, and minorities’ higher mistrust of the healthcare system resulted in less spending on black patients. Because the algorithm used previous health costs as an input, it equated blacks’ lower costs with less need for care management.
Consequently, whites were over twice as likely to be offered care management services as blacks. Unfortunately, this put more black patients at risk of worsening health problems, ER visits, and hospitalizations. The algorithm never used race as an input to the calculation. It was not overtly discriminatory, but discrimination happened anyway.
The researchers developed fixes to the algorithm that mitigated the disparities, more than doubling the percentage of black patients benefiting from care management. The researchers are also offering to help other developers fix their algorithms free of charge, which is remarkable because these are very expensive proprietary programs.
This is only the latest example of a policy that was meant to make things better but ended up failing. PCMH Plus is Connecticut Medicaid’s most recent example. Regular and robust evaluation is critical after even well-intentioned programs and policies are implemented. Then there must be the political will to fix or end programs that don’t work. That last step may be the bigger hurdle. But it must become standard operating procedure if we are to fix our costly, broken healthcare system.
Ellen Andrews, PhD, is the executive director of the CT Health Policy Project. Follow her on Twitter @CTHealthNotes.
DISCLAIMER: The views, opinions, positions, or strategies expressed by the author are theirs alone, and do not necessarily reflect the views, opinions, or positions of CTNewsJunkie.com.