AI Matchmaking in Mammography: A Radiologist’s Guide to Quality and Safety

Being a radiologist in the age of AI devices is like riding a rollercoaster blindfolded: it's a wild ride, and sometimes you wonder who is actually at the controls. The feeling is the same whether you work at a large enterprise or in a private practice. End-users (radiologists) have little visibility into the AI device options on the market, while non-medical professionals review numerous algorithms and make decisions for clinicians based on the available market information. In AI's rapidly advancing world, ensuring the highest quality and safety standards is paramount. There are more AI devices in radiology than in any other medical specialty, and the sources for quality standards are diverse, spanning the FDA, the ACR, peer-reviewed literature, and word of mouth. For radiologists, being informed about regulatory standards is not only about providing better patient care but also about making strategic purchasing and adoption decisions.

Having discussed how quality can be defined for your institution, in this blog post we'll delve into how various stakeholders design quality and safety standards for the industry. We'll also look at how you, as a radiologist, can equip yourself with the knowledge and tools necessary to provide the highest standard of patient care amid the challenges and opportunities presented by the emergence and adoption of AI in breast imaging.

ACR’s Legacy 

The ACR mammography guidelines encompass everything from determining when breast imaging is appropriate to evaluating the quality of mammographic images, ensuring that patients receive the highest standard of care. For example, through extensive scientific analysis and literature review, the ACR established the BI-RADS classification system to standardize risk assessment and quality control for mammography and to provide uniformity in reports for non-radiologists. Analogous systems, such as Lung-RADS for the lung and TI-RADS for the thyroid, have since been adopted worldwide.

Today, no AI device can gain market traction without adhering to the quality standards established within the industry. These quality standards serve as the common language that ensures consistency and reliability in radiological practices. Because the ACR is a community of your peers in radiology, it has the power to influence what the market of radiologist end-users demands. That influence, in turn, shapes the industry.

Take the ACR's dosimetry efforts as another example. Through a centralized platform, the Dose Index Registry, the ACR introduced a form of peer pressure that guided imaging facilities to align with established norms, or at least to identify where they were falling short. Currently, the FDA has approved several AI devices as Software as a Medical Device (SaMD), but it lacks standardized elements that every AI device clearance proposal must include.

A centralized database of the available tools, with standard form elements and curated from actual peer-submitted data on real-world performance, could better guide both developers and end-users toward genuine demand and use cases. This, in turn, could prevent over-dilution of the market with multiple AI devices that perform the same function.
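To make the idea concrete, here is a minimal sketch in Python of what such a standardized registry entry and a simple "crowding" check might look like. The field names and the overlapping_use_cases helper are hypothetical illustrations, not an actual ACR or FDA schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical registry entry for an AI device. The field names are
# illustrative assumptions, not an actual ACR or FDA schema.
@dataclass
class AIDeviceEntry:
    device_name: str
    vendor: str
    intended_use: str                        # e.g., "screening mammography triage"
    clearance_id: Optional[str] = None       # e.g., a 510(k) number, if cleared
    validation_populations: List[str] = field(default_factory=list)
    peer_reported_sensitivity: Optional[float] = None  # aggregated from site submissions
    peer_reported_specificity: Optional[float] = None
    sites_reporting: int = 0

def overlapping_use_cases(registry: List[AIDeviceEntry], use_case: str) -> List[AIDeviceEntry]:
    """Return every registered device claiming the same intended use,
    so buyers and developers can see how crowded a niche already is."""
    return [entry for entry in registry if entry.intended_use == use_case]
```

A registry along these lines, populated with peer-submitted performance data, would let a practice quickly see whether a proposed device adds something new or merely duplicates what is already deployed elsewhere.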

The Role of Orchestrators and Platforms

There is a need to assess which tools are best suited for a particular imaging site, based on factors such as facility infrastructure, available scanners, patient demographics, and population density.

One option is to develop multiple in-house devices tailored to the site's needs. This can bypass FDA registration requirements, since the agency cannot regulate the practice of medicine, which currently covers AI-enabled software developed in-house and not sold or used elsewhere. But bringing AI in breast imaging into legacy infrastructure is not easy: you will spend a long time and incur significant expense developing your own algorithm and host platform and integrating them into the existing workflow. The enterprise will also be responsible for the ongoing expense of maintaining the system.

The other option is to set up contracts with several vendors serving different purposes, picking the best-performing AI device or a combination of devices. However, everyone involved, including the radiologists and IT professionals, would then face the challenge of managing multiple tools, contracts, overhead costs, and vendors. Instead of alleviating physician burnout and improving productivity, this approach to adoption and deployment becomes counterproductive. Orchestrating platforms? Picture them as the matchmakers in the AI dating game. They know which products on the market are good and how to partner with an institution or its radiologists to find the right match for their specific situation.

Platforms act as centralized information brokers, streamlining the complexities of deploying various AI tools. The orchestrating approach also reduces how much radiologists must relearn or adapt their reading habits to new workflows.

They offer developers an incentive to have their algorithms vetted; being stamped with a seal of "transparency" or "clinically validated" creates a defined space in the market for platform-hosted devices. In turn, this knowledge helps radiologists avoid expensive, unproductive pitfalls and make decisions that align with the evolving landscape. Through conversations with these platforms, small private practices can begin to see what is out there in the market; they may not be able to follow in the footsteps of large enterprises, but they can define their own use cases and take an active role in the AI device adoption process.

There is a far lower barrier to entry in developing an AI device that simply performs "better" than in solving a unique use case, and this has led to a market saturated with products performing the same basic function. This is where true platforms, being vendor-neutral, can shape the industry toward what customers actually need instead of simply convincing stakeholders to invest and adopt for the sake of revenue. Lean, simple criteria for quality assurance or approval, coupled with peer pressure from competing vendors and end-user radiologists, encourage compliance from AI device vendors; the alternative has the potential to turn the market against a vendor.

Another area of current regulatory oversight is the AI lifecycle. FDA-approved SaMDs are "locked" versions. The FDA has acknowledged the limitations of this approach and explored the concept of a predetermined change control plan, which lets companies anticipate and outline expected modifications to software-based devices. But with the market moving toward self-learning algorithms that operate as a "black box," there is no reliable way to anticipate those changes or outline the processes that produce the outcomes. Another attempt, a digital health pre-certification program that could enable the FDA to pre-clear reputable manufacturers to update their software products, has not moved past the pilot stage. Through AI device platforms, however, the continuous evolution of device performance can be periodically monitored in-house to ensure patient safety and optimal outcomes, and post-marketing surveillance data can flow in both directions without compromising patient data privacy.
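As a rough illustration of what that in-house monitoring could look like, here is a minimal sketch, assuming a simple drift check on a device's flag rate. The function names, the example numbers, and the tolerance value are assumptions made for illustration, not a validated surveillance protocol.

```python
from statistics import mean
from typing import Sequence

def flag_rate(device_outputs: Sequence[bool]) -> float:
    """Fraction of exams the device flagged as suspicious."""
    return mean(1.0 if flagged else 0.0 for flagged in device_outputs)

def drift_alert(baseline_rate: float, recent_outputs: Sequence[bool],
                tolerance: float = 0.05) -> bool:
    """True if the recent flag rate deviates from the locked version's baseline
    by more than the tolerance, suggesting the algorithm's behavior has shifted
    and merits human review before continued clinical use."""
    return abs(flag_rate(recent_outputs) - baseline_rate) > tolerance

# Example: a device validated at a 12% flag rate now flags 19 of 100 recent exams.
if drift_alert(baseline_rate=0.12, recent_outputs=[True] * 19 + [False] * 81):
    print("Flag-rate drift detected; escalate for human review.")
```

Even a crude check like this, run periodically by the hosting platform or the local team, turns "the algorithm changed on us" from a surprise into a monitored event.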

At the end of the day, it’s about patient care. 

Radiologists' collaborative approach, data-driven standards, and commitment to continuous improvement put them in a vital position to shape the future of mammography through AI.

Siddhi Hegde

Research Fellow and aspiring radiologist exploring new technologies in patient care.
