PROGRAMMING NOTE: Morning eHealth will not publish Monday, October 12. We’ll return to our normal schedule on Tuesday, October 13. In the meantime, please continue to follow POLITICO eHealth.
PANACEA OR PITFALL?: The Hope, the Hype, the Promise, the Peril. That’s the subtitle of a National Academy of Medicine report on health and artificial intelligence. And it’s a good starting point to size up how software, machine learning and complex algorithms are changing the medical system in the middle of a pandemic.
Everybody celebrates the hope and promise; if you google “Covid and AI” you get something like 2.7 million hits.
But it’s easy to gloss over the hype and the peril. If AI is built on data scaffolding that is tainted, flawed or distorted by systematic bias — or just the wrong starting points — those inequities will be baked in and perpetuated.
These topics are on our mind before POLITICO’s annual AI summit next week. It’s timely, when we so desperately need to make sense of things, when we so desperately need scientific breakthroughs — and when we so desperately need tools that will address inequities and foster trust.
“Good intentions can go wrong,” UC Berkeley’s Ziad Obermeyer, who researches the intersection of machine learning, medicine and health policy, told a recent ABIM Foundation conference on health care and trust. Obermeyer, who will be a panelist at our summit, cited an algorithm that conflated health care costs with health care needs — making it appear as if larger sums spent on or by white patients meant they needed more care or were in worse health.
Because of the backward logic, Obermeyer said, “Black patients were prioritized lower than healthier white patients because of lack of access to care and discrimination translating into lower health care costs.”
That algorithm was fixable. But you’ve got to recognize that it needs fixing.
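The flaw Obermeyer describes can be shown with a toy sketch (hypothetical numbers and names, not the actual algorithm): ranking patients by past health care costs underrates equally sick patients whose access barriers kept their spending low, while ranking by measured need does not.

```python
patients = [
    # (name, true health need score, past health care costs in dollars)
    ("Patient A", 8, 12_000),  # high need, good access -> high spending
    ("Patient B", 8, 4_000),   # same need, poor access -> low spending
    ("Patient C", 3, 6_000),   # lower need, moderate spending
]

# Biased proxy: prioritize by past costs
by_cost = sorted(patients, key=lambda p: p[2], reverse=True)

# Fixed target: prioritize by measured health need
by_need = sorted(patients, key=lambda p: p[1], reverse=True)

print([p[0] for p in by_cost])  # Patient B falls below the healthier Patient C
print([p[0] for p in by_need])  # Patient B correctly ranks alongside Patient A
```

Swapping the sort key is all it takes in the sketch; the hard part, as Obermeyer notes, is recognizing that the proxy is wrong in the first place.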
Making sense of the data: From the outset of the pandemic, AI has been touted as a potential game changer. Perhaps it can shore up our lackluster (at best) contact-tracing efforts by identifying — and predicting — cases of Covid-19. It’s being used to analyze medical research, extract useful nuggets and map out testing sites. It’s being deployed to speed up drug research, analyze images to detect pneumonia, or figure out who is at high risk for terrible outcomes. Some cutting-edge work aimed at gaining deeper understanding of the coronavirus is being led by the National Institutes of Health.
There is much promise. Data that now sloshes around the health care system can flow into a “collective memory” that can “inform and improve the subsequent care of the individual patient and the health system more generally,” National Academy experts wrote. And there’s information pouring in from wearables, other devices and patient-reported outcomes.
If it all works, said the National Academy report, the advances will at long last be payback (the good kind) for all the time, expense and aggravation that health care providers have spent wrestling with electronic health records. The report was done before the pandemic took hold, but the overall assessment is only more relevant now.
We also live in an era where social justice — including the inequities in health care that the coronavirus has only intensified — is front and center. It’s essential to ask if AI will close gaps or widen them. Will technology enhance trust — or crush it?
Hope. Hype. Promise. Peril.
RACE AND ALGORITHMS: The decades-old practice of adding race — or simply whether or not a patient is Black — to algorithms used to diagnose and treat health conditions is drawing renewed scrutiny in the Black Lives Matter era. And as physicians and medical students push for changes, POLITICO’s Katy Murphy writes, some medical centers, such as the University of Washington, are beginning to ditch the race-based adjustments.
This summer, as protests against police brutality and racism erupted around the nation, the National Kidney Foundation and the American Society of Nephrology announced a joint task force on the use of race in the field. The societies acknowledged that Black patients were less likely to be identified as kidney transplant candidates than white patients and pointed to a tool used to detect kidney disease, the estimated glomerular filtration rate, which commonly factors in a patient’s race as well as age, weight and gender. In bolded font, the groups wrote that unlike age and weight, “race is a social, not a biological construct.” The panel’s recommendations are due out later this year.
Congress has also chimed in. House Ways and Means Chairman Richard Neal (D-Mass.) appealed last month to medical societies to examine the use of race in clinical decision tools. He cited a June study in the New England Journal of Medicine that identified 13 clinical algorithms using race in fields from obstetrics to oncology.
Welcome back to Future Pulse, where we explore the convergence of health care and technology — and how innovations are changing medical care and consumer choice. Share your news tips and feedback with us: @dariustahir, @ravindranize, @stevenoverly, @ali_lev.
Alyssa Canobbio Hackbarth @AlyssaEinDC “Holy crap. Telemedicine is awesome. Less than 5 minutes on the phone with a doctor and my Rx is already being sent to my pharmacist.”
HOSPITALS THREATENED FOR NOT REPORTING DATA: The Trump administration is getting tough on hospitals that don’t provide updated coronavirus data to the government daily, nearly three months after it abruptly rolled out a controversial new reporting system, POLITICO’s Rachel Roubein writes.
By mid-January, hospitals that don’t regularly report key information can be kicked out of Medicare and Medicaid, their major funding sources. Hospitals must give daily updates on information such as the number of patients hospitalized with Covid-19. They’re also required to file weekly reports on critical resources, like their supply of N95 masks and gowns.
The beefed-up reporting mandate follows the troubled roll-out of a new reporting process that replaced a system used to send information to the CDC. Democrats argued the move sidelined the CDC amid the pandemic, while the administration maintained the new system was more nimble.
Hospitals say their Medicare and Medicaid funding shouldn’t be put at risk. The funding threat “remains an overly heavy-handed approach that could jeopardize access to hospital care for all Americans,” said Rick Pollack, the president and CEO of the American Hospital Association.
QUESTIONS FOR HHS’ CIVIL RIGHTS OFFICE DIRECTOR: Roger Severino has broad oversight over health care data — starting with enforcing HIPAA, the nation’s health privacy law. So his Office for Civil Rights often gets involved when hospitals are hacked, or when patients try to get a copy of their medical records and hit roadblocks. Severino spoke with POLITICO’s Darius Tahir about his work. The interview has been edited for length and clarity.
Your office recently reached a series of settlements with providers over alleged negligence in building defenses to hacking. How do you see the cat-and-mouse game between hackers and data-holders evolving?
We’ve seen hacking grow to become the number one cause of large breaches reported to OCR. We’re up to over 60 percent of large breaches stemming from hacking incidents. It’s a significant change over the past several years, and we as regulators have to adapt to circumstances.
Part of our job as regulators is to hold entities accountable that don’t take the basic steps to comply with the HIPAA rules. What we’re talking about, in many instances, is things that can be done with just foresight and attention, and sensitivity to the issues.
We’re still going after the low-hanging fruit in terms of violations. We had one entity that was notified by the FBI that had traced a hacking group’s attack to their own information systems. Yet they didn’t take action to stop the continued exfiltration of the data. When the FBI tells you, “Hey there’s a problem here,” and you don’t take proper corrective action, you should expect some HIPAA enforcement action — and rightly so.
Your office has previously made it a point to enforce regulations mandating patients have affordable access to their medical records. Have you seen improved compliance from health care providers?
It’s too soon to see overall. However, we knew there was large-scale non-compliance, which is why we launched the initiative last year.
We had done one resolution in the previous, I believe, 10 years in the Office for Civil Rights. And now we’re up to eight. Which is a tremendous change and underscores a message — if you do not provide timely access to medical information at a reasonable cost, you’re risking an enforcement action.
I personally know what it’s like, the frustration people feel. When transferring doctors, you need to be able to move those records around and get access to them when you need them most. The ability to transfer and produce records has gotten easier as care has moved to electronic health records.
The excuses for not providing a person’s medical records are falling away, so we’ve moved to enforcement to ensure that the bureaucratic inertia is shaken off, and people can get access to the records when they need them, at a reasonable cost.
Welcome to POLITICO’s newest community: Future Pulse. Future Pulse, formerly Morning eHealth, will be delivered to your inbox every Wednesday. Interested in more granular eHealth coverage? Get in touch to learn more about POLITICO Pro.
The coronavirus pandemic is upending how we treat disease and protect public health, and looks set to accelerate the convergence of health care and technology. From virtual doctor visits to artificial intelligence, the pandemic is spurring short-term fixes that could bring lasting change to the U.S. medical system, but the innovation has also raised serious privacy concerns and revealed stark inequalities about access to care.
Future Pulse is a new weekly newsletter for policymakers, executives, activists and any readers who are interested in the rapidly changing world of health care and technology. We will separate fads from real advances, explore where experimentation is working and where it’s not, and investigate the tension between innovation, regulation and privacy. Join the conversation!
Did someone forward you this newsletter? Sign up here to subscribe.
This week we want you to tell us if you trust your primary care provider to protect your personal information against hacking. Send replies to: [email protected] and [email protected].
USING VIDEO GAMES, AI TO SAVE YOUNG PEOPLE: Amid alarming levels of young people reporting suicidal thoughts, a suicide prevention nonprofit focused on LGBTQ youth is adding to its digital toolbox, POLITICO’s Mohana Ravindranath reports.
The Trevor Project is promoting its own crisis hotline on digital hoodies worn by characters in the popular video game Animal Crossing, using AI and natural language processing to prioritize high-risk callers, and advocating for telemental health that’s in high demand during the pandemic. It’s all part of an effort to reach young people who can’t afford traditional mental health services or feel ignored by health care providers, said Myeshia Price-Feeney, a research scientist for the Trevor Project.
CDC data shows that a quarter of young people have contemplated suicide during the pandemic, and these times have been especially challenging for marginalized groups. Forty-four percent of Black LGBTQ youth have seriously considered suicide in the past year, including 59 percent of those identifying as transgender and nonbinary, according to a new Trevor Project report. Price-Feeney said there’s a real need for mental health app developers to design products tailored for these at-risk populations.
“You can’t just broadly apply something and expect it to work for everyone,” she said.
TELEHEALTH DIDN’T FILL PRIMARY CARE GAP: Health care watchers have been agog at the surge in telehealth visits during the pandemic — but technology hasn’t filled the gaps left by missed primary care visits, a study in JAMA Network Open shows.
The analysis of claims data shows a 21.4 percent drop in primary care visits during the second quarter of 2020, even after accounting for the steep rise in telehealth appointments, which made up just over a third of all visits during that period.
The examinations during these visits differed significantly. Almost 70 percent of office-based visits in the second quarter of 2020 featured a blood-pressure check; just 10 percent of telemedicine visits did.
Researchers didn’t find any evidence bearing out concerns that a surge of telemedicine would result in the overprescribing of opioids, antibiotics and other drugs. The data also showed virtual visits by Black patients rose at about the same rate as those by white patients — possible comfort to those concerned about widening health disparities.
Duke is using anthropologists and sociologists to incorporate artificial intelligence into its work, STAT News reports.
Government needs to do much better in fostering trust for contact tracing, declares a Med Page Today op-ed.
Cargo airlines are plotting vaccine distribution, writes the Wall Street Journal.