Showing that traditional reporting still retains its power, an exposé of an error-prone artificial intelligence algorithm has spurred a class-action suit against UnitedHealth Group, operator of the nation’s largest Medicare Advantage plan.
The suit, prompted by a series of STAT investigative stories, accuses UnitedHealth of deploying an algorithm with a known high error rate in order to deny costly rehabilitation care to seriously ill patients. (United says the lawsuit “is without merit.”) In May, a STAT story and an ongoing ProPublica investigative series sparked a Senate hearing; both focused on insurers’ claims-denial tactics that in some cases have allegedly involved circumventing state laws mandating coverage of fertility or cancer treatments.
Meanwhile, the New York Times and KFF Health News are investigating “the financial and emotional toll of providing and paying for long-term care” in a joint series titled “Dying Broke.” And The Washington Post, asking “Why are so many Americans dying early?” has begun examining the roles played by politics, stress and chronic illness.
At a time of splintered information sources and banged-up media budgets, traditional reporting has nonetheless managed to remain relevant. Recently, the two STAT reporters investigating the alleged algorithm abuse, Casey Ross and Bob Herman, described their work on a webcast moderated by their editor, Lison Joseph. Listening to them, as someone who regularly did investigative reporting back in the (prehistoric) pre-Internet and pre-smartphone era, I was struck by how some basic journalistic principles have remained constant.
How an Investigation Begins
Good investigative reporters are skeptics. Investigative reporting takes a prodigious amount of time and effort, while simultaneously being fraught with legal and reputational hazards. So when a post-acute care provider reached out to Ross and Herman with concerns about coverage decisions, the two initially took the complaints “with a big grain of salt.” After all, those providing nursing home and rehabilitation care have a financial interest in getting paid, and both industries have their own checkered histories.
But this call was different. The clinician, Ross recalled, “had a shouting-at-the-wall level of anguish” at the havoc being wreaked by an AI algorithm used for coverage decisions. At about the same time, in a stroke of good fortune, the Centers for Medicare & Medicaid Services was soliciting comments on MA plans. A number of those public-record comments, said Herman, came from individuals “similarly infuriated” at what had happened to them, family members or clients.
That widespread level of concern signaled this was a problem worth probing. Moreover, it was a high-stakes problem not just for vulnerable individuals – a cornerstone of journalistic storytelling – but also for public policy. It involved taxpayer-funded MA plans, which cover more than 31 million Americans, and an emerging technology, AI, whose workings often remain inscrutable even as its influence explodes.
Ross and Herman’s first story, on March 13, 2023, was headlined, “Denied by AI: How Medicare Advantage plans use algorithms to cut off care for seniors in need.” It began with the heart-rending tale of an 85-year-old Wisconsin woman with a shattered left shoulder and an allergy to pain medicine who was pushed out of her nursing home to an apartment where she lived alone despite severe pain and an inability even to go to the bathroom without help. Yet, they wrote, an “unregulated predictive algorithm” from NaviHealth, owned by UnitedHealth, declared her ready for discharge. (NaviHealth responded that coverage decisions are based on Medicare criteria and the patient’s health plan.)
Allegations Broaden and Deepen in Importance
As often happens, the first exposure of alleged wrongdoing opened the floodgates for others to come forward. Case managers and physician reviewers at NaviHealth were “traumatized,” said Ross, by the consequences for patients of a job-performance target saying discharge decisions shouldn’t deviate from the algorithm’s recommendations more than one percent of the time. Conversations with these sources “felt like confessions,” he said. “These people…have a life mission to advocate for people,” but faced a choice between fulfilling that mission “and keeping their job.”
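To make concrete what such a target implies, here is a minimal sketch, in Python, of how a “deviation from the algorithm” rate might be measured. STAT reported the target’s existence, not its formula, so the function, names and data below are invented for illustration:

# Hypothetical illustration only: STAT reported a target that discharge
# decisions deviate from the algorithm's recommendations no more than
# one percent of the time, but did not publish how that rate is computed.

def deviation_rate(recommended, decided):
    """Fraction of cases where the reviewer's decision (covered days of
    rehab care) differs from the algorithm's recommendation."""
    if len(recommended) != len(decided):
        raise ValueError("inputs must be the same length")
    deviations = sum(r != d for r, d in zip(recommended, decided))
    return deviations / len(recommended)

# Toy data: algorithm-recommended vs. actually approved days of care.
recommended = [20, 14, 18, 30, 16]
decided = [20, 14, 25, 30, 16]  # one reviewer granted a longer stay

rate = deviation_rate(recommended, decided)
print(f"deviation rate: {rate:.0%}")  # -> deviation rate: 20%
print("meets target" if rate <= 0.01 else "exceeds 1% target")

Measured this way, the ceiling is strikingly tight: a reviewer who overruled the algorithm in even one case out of fifty would already be at 2 percent, double the reported target.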
In a July 13 story headlined, “How UnitedHealth’s acquisition of a popular Medicare Advantage algorithm sparked internal dissent over denied care,” Ross and Herman related how the algorithm’s use changed after NaviHealth was acquired by UnitedHealth’s Optum division. Clinical judgment abruptly gave way to following the algorithm, one medical reviewer told them.
But the stories also started to focus on the broader implications. What the UnitedHealth example demonstrated, wrote STAT, was the “potential dark side of artificial intelligence” as “large corporations gain control of increasingly powerful and largely unregulated AI tools.” Three months after that story ran, UnitedHealth said it was discontinuing the NaviHealth name as part of a rebranding. The STAT stories, however, didn’t stop.
A Nov. 14 article carried the stark headline, “UnitedHealth pushed employees to follow an algorithm to cut off Medicare patients’ rehab care.” Lengthy additional reporting included the allegation that “some of the AI-guided decisions also may sidestep” Medicare coverage rules and the revelation that CMS was looking into the questions raised by STAT. The class-action suit was filed at roughly the same time.
Journalists, Accountability and AI
Investigative journalism has long been a powerful tool for spotlighting health care abuses: physician greed in the unregulated 1950s, the suffering of the uninsured elderly and poor in the 1960s, rampant unnecessary surgery in the 1970s and, more recently, surprise medical bills and predatory drug pricing. Sometimes fundamental change results, as with the passage of Medicare and Medicaid. Other times – far too often – abuses recede only to crop up again a few years later in a slightly different form.
As a judge for the annual NIHCM journalism awards, I’ve been able to see some of the “phenomenal, groundbreaking work” being done today, as veteran investigative journalist Fred Schulte phrased it. Still, investigative reporting has limits.
One is frustratingly obvious. Journalism may be a “truth seeking profession,” as described in the 1987 book The Virtuous Journalist, but those truths can prompt change only if they motivate those with real power to act on them. While Ross and Herman said their goals were “clear answers” and “accountability” for how the UnitedHealth algorithm was trained and deployed, it remains to be seen whether their revelations will rev up enough pressure to achieve those aims.
The second limitation is more subtle but equally important. A compelling investigative story depends on clear “good guys and bad guys.” Reality is more complex. Consider a recent Lown Institute study of unnecessary coronary stents. It concluded that a Medicare patient receives an unnecessary stent every seven minutes (at that pace, roughly 75,000 a year), costing the government hundreds of millions of dollars annually. Dr. Vikas Saini, a cardiologist and president of Lown, called the practice “dangerous” to patients, but attributed it to physicians “struggling to keep up with the evidence.”
That assessment strikes me as deliberately generous; yet pinning the entire problem on greed, complete with good guys and bad guys, would be simplistic in the opposite direction. Maybe what the doctors, hospitals and insurers all need is an algorithm everyone can trust.
Can journalists help bring accountability to health care AI? Perhaps as a hopeful harbinger of the future, Ross, STAT’s national technology reporter, said in the webcast that he was frustrated UnitedHealth wouldn’t put him on the phone with one of its data scientists to explore the algorithm in detail. As our society moves into a murky future of often-unconstrained AI, shining the spotlight of investigative journalism on decisions made by machines may yet become as commonplace as doing the same for decisions made by people.