Flawed Software
Contrasting with this is the case of Lowe v. Cerner Corporation. In this instance, a patient had been hospitalized for gallbladder surgery. After surgery, the surgeon placed an order in the electronic health record for continuous pulse oximetry because of the patient's baseline chronic respiratory conditions. Because of a known defect in the computerized physician order entry system, the order for pulse oximetry was forward-dated to the next morning and was not conveyed to the postoperative care team. During the intervening unmonitored period, the patient suffered a respiratory arrest and a hypoxic brain injury.
ACEP Now: September 2025
Unlike the HeartWise case, the courts ultimately allowed this claim against the software vendor to proceed. Crucially, in this instance, the plaintiff alleged a specific negligent design defect in the software product itself. This also provided some cover for the surgeon, whose legal team argued that the fault lay in a software flaw that did not meet "industry standard." Further, because the flaw was known to the company, the case could also move forward under a "failure to warn" theory.
It should be noted that both cases began in district court but had their judgments reversed at the circuit and supreme court levels, illustrating the complexity of interpreting the underlying issues. The key point, however, is how narrow the circumstances are in which liability can be shifted from the clinician to the software product itself.
Perceived Liability
Unfortunately, the clinician's burden in the modern age is further complicated by how a jury at trial may perceive any software algorithm or AI. The illustrative cases above concern the distribution of potential liability, but they raise a subsequent question: how does the involvement of AI affect juries' liability decisions? An interesting research study, published in NEJM AI, examines these issues in the context of radiology, one of the specialties at the forefront of AI augmentation.2
In this study, surveyed laypersons were asked to provide their opinions regarding legal liability in two hypothetical cases. In the first case, a hypothetical radiologist was the defendant in a stroke case in which an intracranial hemorrhage was missed. As part of the hypothetical patient's care, intravenous thrombolytics were administered. Because of the missed hemorrhage, the patient suffered hematoma expansion and subsequent brain injury. In the second case, a radiologist failed to detect an abnormality suspicious for lung cancer. As a result of the delayed diagnosis, the patient died prematurely because of the lost opportunity for treatment.