Artificial Intelligence (AI): The Future of Clinical Image Interpretation


Written by Carl Witonsky, Managing Director

In 2017, more than 377 million clinical images were taken in the United States (including MRI, CT, angiography, ultrasound, mammography, and X-ray). Most were read and interpreted by the country’s roughly 34,000 radiologists and 18,000 pathologists, with specialists in cardiology and orthopedics increasingly reading and interpreting as well. Most digitized images are stored in an Electronic Medical Record (EMR), directly or through a data link, along with a dictated transcription that classifies the result as negative (a normal, healthy finding) or positive (the presence of abnormal findings, such as a bone fracture or a malignancy, in which case additional specifics such as location, length, width, structure and other abnormalities are included). If a tissue biopsy is subsequently taken, pathologists interpret numerous individual glass slides (up to 60 for a breast cancer biopsy!) under a microscope. During the past decade, some hospitals have been digitizing their slides, but the practice was not broadly adopted until AI became available.

Artificial Intelligence is a data-driven analytical system that “learns” from image examinations matched against reported clinician findings. It refines its knowledge and accuracy by processing thousands upon thousands of images and their associated findings until it becomes at least as good as the best clinicians’ interpretations. The joint development announced by Memorial Sloan Kettering (MSK) and Paige.AI in September 2018 shows exactly what they (and other large cancer treatment centers) are attempting to accomplish. The agreement gives Paige.AI exclusive rights to access MSK’s 25 million digitized pathology slides as well as MSK’s intellectual property in the field of computational pathology. All digitized slides are accompanied by clinical annotation and anonymized genomic sequencing results. It may take a year or more and millions of dollars to establish a knowledge base of millions of known results, but make no mistake: compared to autonomous cars, this is a relatively easy problem to solve.
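To make the learning loop concrete, the sketch below is a deliberately minimal, hypothetical illustration (not any vendor’s actual model): each “image” is reduced to a toy feature vector, and a nearest-centroid classifier learns by averaging the vectors of images whose findings clinicians have already reported, then classifies a new image by the closest learned centroid. All data and function names here are invented for illustration.

```python
# Hypothetical sketch: "learning" from images matched against
# clinician-reported findings, via a nearest-centroid classifier.
from statistics import mean

def train(examples):
    """examples: list of (feature_vector, label) pairs, where labels are
    clinician-reported findings such as 'negative' or 'positive'.
    Returns one averaged centroid vector per label."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {label: [mean(col) for col in zip(*vecs)]
            for label, vecs in by_label.items()}

def predict(centroids, features):
    """Classify a new image's features by the closest learned centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], features))

# Toy "images": two-number feature vectors with clinician-reported labels.
training = [
    ([0.1, 0.2], "negative"), ([0.2, 0.1], "negative"),
    ([0.9, 0.8], "positive"), ([0.8, 0.9], "positive"),
]
model = train(training)
print(predict(model, [0.85, 0.9]))  # → positive
```

Real systems use deep neural networks over millions of pixels rather than two-number vectors, but the principle is the same: the more labeled examples processed, the better the learned representation matches clinician interpretations.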

Implementation might start with one class of equipment and disease, e.g., mammography and breast cancer, running radiologist/pathologist and AI findings in parallel until there is a high level of confidence in the AI. There is no question, however, that the need for radiologists and pathologists to read digital images and glass slides will drop significantly as adoption proceeds. Development could be accelerated if large hospital systems with hundreds of thousands of stored digital images and results agreed to share data with one another, each covering a specific disease, but that may be expecting more cooperation and sharing than hospital executives are willing to give.
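The parallel-findings phase can be sketched as a simple concordance check: for each study, compare the AI’s call against the clinician’s, and only retire the parallel workflow once agreement clears a confidence bar. The data, function name, and 95% threshold below are all illustrative assumptions, not a regulatory standard.

```python
# Hypothetical sketch of the parallel-read phase: AI and clinician
# interpretations are compared until agreement clears a confidence bar.

def concordance(clinician_reads, ai_reads):
    """Fraction of studies where the AI matches the clinician's call."""
    matches = sum(c == a for c, a in zip(clinician_reads, ai_reads))
    return matches / len(clinician_reads)

# Toy batch of five parallel reads (invented data).
clinicians = ["positive", "negative", "negative", "positive", "negative"]
ai_calls   = ["positive", "negative", "positive", "positive", "negative"]

rate = concordance(clinicians, ai_calls)
print(f"Agreement: {rate:.0%}")  # → Agreement: 80%
print("high confidence" if rate >= 0.95 else "keep running in parallel")
```

In practice the comparison would be run per disease and per modality over thousands of studies, with discordant cases adjudicated by senior clinicians and fed back into training.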

The implementation and adoption of AI in clinical imaging will primarily involve AI technologists working side by side with senior radiologists and pathologists. The adoption difficulties witnessed in the roll-out of EMRs, which affected every doctor, nurse and administrator in the hospital, should therefore not recur. Over a decade, this one area of healthcare could save hundreds of billions of dollars in radiology/pathology interpretation costs by utilizing Artificial Intelligence.

Investors attending HIMSS this year should be looking for AI technology firms with healthcare-specific capabilities and working relationships or partnerships with large integrated healthcare delivery systems that have accumulated millions of digitized radiological images and tissue slides. AI promises to lead the next major healthcare IT revolution, one that can truly yield better outcomes at significantly lower costs.

Please contact Carl Witonsky to discuss Artificial Intelligence and learn how Falcon Capital Partners might be able to help.