An official publication of the American College of Emergency Physicians
The AI Legal Trap in Medicine

By Ryan Patrick Radecki, MD, MS | August 14, 2025
Pearls From the Medical Literature

In each of these cases, the laypersons received testimony from hypothetical human experts for both the plaintiff and the defense and were asked to render a "baseline" judgment on the radiologist's liability. The laypersons were then shown the recommendations the AI system had given the radiologist and surveyed for any subsequent effect on their liability judgment. The permutations included scenarios in which the AI found the missed diagnosis but was overruled by the radiologist, as well as scenarios in which the AI also missed the diagnosis. Some permutations additionally disclosed the accuracy level of the AI.

Explore this issue: ACEP Now, September 2025

The surveyed laypersons judged the hypothetical radiologist to be at fault about 60 percent of the time in both baseline cases. Having an AI that also missed the diagnosis was protective, and the level of protection increased further when the AI was described as very sensitive, missing only one in 100 cases in real-world use. The flip side, however, was that disagreement with the AI increased the radiologist's perceived liability. This was somewhat ameliorated by adding information pointing out the AI was not very specific, with a high false-positive rate of 50 percent. Within the limitations of the survey and its sampled population, the message is clear: Disagreeing with the AI is definitely riskier than agreeing with it.

It is unclear whether perceptions of AI will change as the public becomes more familiar with its abilities and limitations, or as AI accuracy advances. Regardless, however strong AI diagnostic capabilities become, there is little indication that ultimate responsibility will lie anywhere other than with the treating clinician. Navigating the recommendations provided by AI and software algorithms will require an extra level of care, particularly when overruling their conclusions.


Dr. Radecki (@EMLITOFNOTE) is an emergency physician and informatician with Christchurch Hospital in Christchurch, New Zealand. He is the Annals of Emergency Medicine podcast co-host and Journal Club editor.


References

  1. Mello MM, Guha N. Understanding liability risk from using healthcare artificial intelligence tools. N Engl J Med. 2024;390(3):271-278.
  2. Bernstein MH, Sheppard B, Bruno MA, et al. Randomized study of the impact of AI on perceived legal liability for radiologists. NEJM AI. 2025;2(6).


Topics: AI, Clinical Decision Tools, Legal, Liability, Malpractice, Patient Safety, Risk

