FDA plan for reviewing AI-enabled medical devices isn't quite computing


Makers of medical devices that use artificial intelligence are clamoring for clarity from the Food and Drug Administration about its approval framework for devices evolving in real time.

Manufacturers applauded the FDA in April when it issued draft guidance known, in short, as the Predetermined Change Control Plan for AI. A PCCP would allow device makers to prespecify, in a product's initial premarket submission, future capabilities of the device without resubmitting it later for approval.

Under the traditional regulatory framework, manufacturers of machine learning devices might have to make another marketing submission for a safety and effectiveness determination, a potentially costly and time-consuming process.

“FDA’s current regulatory framework works well for medical devices that undergo iterative improvement over time or serially, but not so much for devices that are evolving/iterating in real time as is the case with machine learning devices,” said Kyle Faget, a partner at Foley & Lardner and co-chair of the firm’s health care practice group.

The current process “is not well suited for the faster iterative design and development, and type of validation used for software device functions,” Foley & Lardner said in a client advisory.

But the PCCP process still has device makers scratching their heads, based on comments they’ve filed over the last several months about the draft guidance.

For example, the Combination Products Coalition, in a filing with the FDA, said the guidance lacks specifics “in how the agency expects to implement practices for drug-device and biologic-device combination product and software used in combination with medicines.”

Meanwhile, the Medical Device Manufacturers Association, while also welcoming the guidance, asked the agency to clarify circumstances in which a PCCP may be used beyond AI/machine-learning-enabled software functions, such as signal processing algorithms.

Stakeholders are saying, “Hey, we want greater definition here because we want to understand what kind of flexibility that we have,” Faget said. “The hard part is there is still so much unknown about AI.”

For example, while AI could potentially take over some routine tasks that clinicians perform, allowing AI to stand in for certain clinical decision-making raises all kinds of issues, including the confidence level of such technology, Faget added.

“I think the FDA is struggling to get its arms around how to regulate it. … We’re really in the nascent stage with AI.”

Even with uncertainties about the technology and the FDA’s traditional, onerous approval regimen, the agency has approved or cleared more than 600 AI/machine-learning devices so far. Last month it added another 171 devices to that list.

The vast majority (79%) of devices authorized through July were in radiology. That’s followed by 9% in cardiovascular, 5% in neurology and 4% in gastroenterology/urology.

The agency said none of the devices authorized to date uses generative AI or artificial general intelligence, forms of AI that rely on very large datasets and are theoretically more capable in areas such as prediction.

But the FDA has “already experimented” with PCCPs as part of some marketing authorizations of AI devices in recent years, Ropes & Gray noted in a client advisory.

One of these was an authorization of a software-only device by Caption Health that uses AI to emulate the expertise of a sonographer, providing guidance and feedback to the operator for improving the quality of cardiac ultrasound images. The PCCP allows for certain future algorithm improvements.

The iterative potential for AI could result in rapid evolutions of a medical device, such that “a year and a half later you’re looking at a completely different device,” Faget said.

As such, some of her clients are “super-frustrated” by existing regulatory hurdles.

“That’s very frustrating for innovators because it takes time to go through the regulatory process. My clients are pushing the boundaries all of the time,” she said.

“I have to say, ‘That’s fine, but we’re going to have to approach the FDA about it,’” she added.

For the near term, however, she predicts that the FDA, using PCCPs, will say yes to some lower-risk products and see how they perform in the marketplace. “I think the FDA is saying we need better sight lines into these particular products,” she said.


A 2021 research paper by Harvard and University of Pennsylvania researchers noted that some of the FDA-approved devices with AI-machine-learning algorithms outperform physician assessment in clinical use.

However, “despite their promise, AI/ML algorithms have come under scrutiny” for inconsistent performance. “Algorithm inaccuracy may lead to suboptimal clinical decision-making and adverse patient outcomes. These errors raise concerns over liability for patient injury,” the researchers said.

That potential liability trail extends beyond physicians, to health systems and device manufacturers.

“Increasing liability for use or development of algorithms may disincentivize developers and health system leaders from introducing them into clinical practice,” the researchers wrote.