7 Life-Saving AI Use Cases in Healthcare

From early disease detection through enhanced medical decision-making to better patient outcomes—here's how AI technology is transforming the healthcare industry.

Can AI transform our healthcare system?

The short answer is: Yes. 

But—

The question we should be really asking ourselves here is: How?

How can the healthcare industry benefit from the use of artificial intelligence? 

Areas of impact for AI in healthcare

Multiple research studies suggest that:

  • Over 93% of practitioners agree that ML-powered processes could help resurface hidden or unobtainable value in the healthcare industry. (Accenture)
  • AI can achieve better patient outcomes, enhance care delivery, and drive operational gains for the healthcare industry. (EIT Health and McKinsey)
  • Over 69% of healthcare companies are piloting or adopting AI solutions for tasks ranging from administrative assistance to preliminary diagnosis. (Accenture)

From enhanced care delivery through AI-assisted diagnostics to medical records management, AI technology is revolutionizing modern healthcare using machines that can learn, understand, predict, and act.
In this article, we’ll take a look at seven different use cases of artificial intelligence in healthcare:

  1. AI-Assisted Chest X-Ray Analysis for Pulmonary Diseases 
  2. Dermatology Scans for Melanoma
  3. Advanced CT and MRI Scan Analysis 
  4. AI-Assisted Breast Cancer Detection 
  5. Artificial Intelligence for Digital Pathology
  6. Automation of Administrative Tasks in Healthcare
  7. AI for Ophthalmology
  8. The Future of AI in Healthcare
💡 Pro tip: Looking for a medical image and video annotation tool? Check out our Medical Image Annotation with V7 guide.

1. AI-Assisted Chest X-Ray Analysis for Pulmonary Diseases 

AI chest x-ray annotation analysis

X-ray and computed tomography (CT) are the two trusted means of diagnosing pulmonary diseases. 

Such medical imagery provides doctors with life-saving insights that result in better health outcomes.

However—

It takes a great deal of experience to interpret the images properly. 

When the patient volumes are high and the timing is critical, doctors need reliable tools that can help them make accurate predictions and act immediately.

That’s where artificial intelligence applications come in handy.

State-of-the-art image recognition algorithms can pick up patterns in X-ray images that are too subtle for the human eye. Machine learning systems also work at a faster pace and can assist with patient classification and triage. 

COVID-19 

A group of researchers recently came up with an artificial intelligence solution for screening COVID-19 patients based on their lung X-rays. 

The system uses principal component analysis (PCA) and clustering techniques to distinguish between L and H phenotypes of COVID-19.

COVID 19 clustering in frontal x-rays
(Clustering of COVID-19 patients from frontal (PA) X-ray images. Source: medRxiv)


This AI tool could help healthcare professionals make better treatment decisions: determine whether the patient needs mechanical ventilation to survive (H phenotype) or if applying this treatment could cause lung injury (L phenotype). 
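The researchers' exact pipeline isn't reproduced here, but the general idea of dimensionality reduction followed by clustering can be sketched in a few lines of scikit-learn. In this sketch, synthetic arrays stand in for real X-ray data, and the parameter choices are illustrative, not taken from the paper:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def cluster_phenotypes(images, n_components=10, n_clusters=2, seed=0):
    """Flatten X-ray images, project them with PCA, and cluster the
    low-dimensional representations into candidate phenotype groups."""
    X = images.reshape(len(images), -1).astype(float)
    Z = PCA(n_components=n_components, random_state=seed).fit_transform(X)
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(Z)

# Synthetic stand-in for two visually distinct groups of 32x32 frontal X-rays
rng = np.random.default_rng(0)
group_a = rng.normal(0.2, 0.05, size=(20, 32, 32))
group_b = rng.normal(0.8, 0.05, size=(20, 32, 32))
labels = cluster_phenotypes(np.concatenate([group_a, group_b]))
```

With well-separated groups like these, the two clusters recover the original grouping; on real X-rays, the clusters would then be inspected by clinicians to see whether they correspond to the L and H phenotypes.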

Another group of medical researchers developed an automated image classification system based on Microsoft CustomVision. 

The AI solution also helps classify different types of pneumonia (COVID-19 and pneumonia from other etiologies). 

When tested on a patient data sample, the model performed with 92.9% sensitivity (recall) and correctly identified:

Patient sample in covid-19 AI model performance


Pretty impressive, right?

It’s only a matter of time before more comprehensive patient data helps advance AI solutions for diagnosing pulmonary diseases.

And speaking of patient data...

We’ve been working on that, too :)

💡 Pro tip: You can check out our free dataset with 6000+ annotated X-ray lung images here.

Lung Cancer 

Using AI has proven to be effective in identifying previously undetected cancers on chest X-rays.

Clinical researchers from the Seoul National University Hospital found that a commercially available AI application outperformed a team of four thoracic radiologists on first and second reads. 

The tested artificial intelligence algorithm:

  • Produced higher specificity than the radiologists during first reads.
  • Helped the team members better interpret the results during the second read. 

The experiment results are promising, but—

Medical professionals still doubt that AI algorithms could produce consistent results in clinical settings when “fed” with larger sample data. 

Indeed, the problem of bringing machine learning technology to production is a pressing one for healthcare companies. 

Many require modernization of their current software systems before they can run advanced machine learning workloads. Others need more mature data management and data governance practices to ensure the security of processed imagery and other patient records.

Radiology x-ray windowing annotation
"Windowing" allows humans and machines to see details in X-rays beyond the range of colours that our monitors enable us to see.
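As a rough illustration, windowing maps raw scanner intensities onto the 0-255 range a display can actually show: everything below the window clips to black, everything above it to white, so subtle differences inside the window become visible. The center/width values below are typical lung-window settings in Hounsfield units; exact presets vary by scanner and vendor:

```python
import numpy as np

def apply_window(values, center, width):
    """Clip raw intensity values to a window and rescale to 0-255 for display."""
    lo, hi = center - width / 2, center + width / 2
    windowed = np.clip(values, lo, hi)
    return ((windowed - lo) / (hi - lo) * 255).astype(np.uint8)

# A typical "lung window" (center -600 HU, width 1500 HU) on sample HU values
sample = np.array([-1000, -600, 0, 400])
print(apply_window(sample, center=-600, width=1500))  # [ 59 127 229 255]
```

The same image can be re-windowed with different presets (e.g., a narrower "mediastinal window") to reveal different tissue types from one scan.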

2. Dermatology Scans for Melanoma

melanoma detection ai computer vision annotation

Skin cancer is one of the most common types of all cancers. 

Melanoma may be the rarest subtype, but it will be responsible for over 7,000 deaths this year in the United States alone.

Here’s the better news—

Up to 86% of melanoma cases are preventable, and the disease responds well to treatment when detected at an early stage. As you've probably guessed, artificial intelligence and machine learning can be of help.

Yet, the devil is in the details. 

Early-stage melanoma is difficult to distinguish from benign moles and other malignant neoplasms. And that is why doctors are experimenting with AI technology to gain extra decision support. 

The International Skin Imaging Collaboration (ISIC) has been a driving force in creating annotated datasets of skin lesion images to improve the quality of computer-aided research. 

Commoditized access to cloud computing technologies and active knowledge sharing among machine learning professionals also helped significantly accelerate skin cancer research over the past five years. 

Here are some encouraging findings:

In 2019, researchers used a convolutional neural network (CNN) built on the InceptionV3 and ResNet50 architectures to analyze close-up and dermoscopic medical imaging datasets. 

The goal was to detect non-pigmented skin cancers. 

After training the model, they compared how artificial intelligence performed against 95 dermatologists. 

What was the result?

The AI delivered the same levels of accuracy as human experts. 

The same levels of accuracy.

Let that sink in for a second. 

And that’s not all—the tool also performed better than beginner- and intermediate-level dermatologists. 


Source: US National Library of Medicine


Another scientific group pitted InceptionV4 (a deep learning architecture approved for medical purposes by the European Union) against a group of human dermatologists.

Both were asked to diagnose 100 test cases (including 60 benign and 40 malignant skin lesions) by first reviewing a dermoscopic image and then additional clinical close-up images and clinical information. 

Here are the results (with a dermoscopic image only):

Dermoscopic image melanoma detection AI model


With more healthcare information available, the mean sensitivity of humans increased to 94.1%, but the mean specificity stayed the same. The CNN system also did slightly better than the doctors this time, scoring 95% sensitivity. Still, more experienced dermatologists outperformed the algorithm in many cases. 
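For reference, the sensitivity and specificity figures quoted throughout this article are simple ratios over a confusion matrix. The counts below are illustrative only, not taken from the study:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity (recall): share of actual positives the model catches.
    Specificity: share of actual negatives it correctly rules out."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 40 malignant and 60 benign lesions
sens, spec = sensitivity_specificity(tp=38, fn=2, tn=51, fp=9)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # 95.0%, 85.0%
```

The trade-off matters clinically: high sensitivity keeps missed cancers rare, while high specificity keeps unnecessary biopsies rare, and tuning a model's decision threshold moves it along that trade-off.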

3. CT and MRI Scan Analysis with AI and Machine Learning

mri-ct-image-annotation brain

Now, let’s see how AI can benefit radiology, which has long been at the forefront of innovation in the healthcare industry. 

This segment is ahead of the adoption curve when it comes to artificial intelligence and machine learning use.

With the accessibility and quality of medical imaging annotation improving, data scientists in this domain can further enhance the performance of existing artificial intelligence models. 

What’s more, such tools support better decision-making and help doctors win back precious time typically spent on image analysis, freeing them for clinical and laboratory work. 

💡 At present, AI technologies can help extract over 1,500 data points from CT, MRI, and PET scans to help healthcare professionals perform more accurate diagnoses.

Some of the emerging AI use cases in this category include: 

  • CT scans for pneumonia: A group of Chinese researchers developed a test AI system for analyzing radiology images for early signs of COVID-19 pneumonia. The tool is said to save doctors about 30-40% of detection time, so that professionals can identify, isolate, and treat contagious patients faster.  
  • Quality augmentation of MRI brain scans: A scientific group from Spain developed a deep learning algorithm for increasing the resolution of MRI images. Doing so helps healthcare professionals identify complex brain-related pathologies, including cancer, speech disorders, and physical trauma. 
  • Accelerated production of MRI images:  Facebook AI and NYU Langone Health have been working on a joint project called fastMRI. It offers a novel approach to MRI image creation using AI, which makes the scanning process 4X faster. When such images were given to radiologists, they couldn’t distinguish between the traditional and AI-generated scans. 

To date, the FDA (Food and Drug Administration) has approved over a dozen deep learning methods for use in radiology. 

Globally, many more AI pilots are moving from the labs to hospital wards. 

an MRI scan of the pelvis with image annotation

4. AI-Assisted Breast Cancer Detection 

mammography detection AI in DICOM image annotation

Here’s a fact: Regular screenings have proven to significantly improve breast cancer survival rates.

Early detection and timely treatments have high efficacy rates. However, the wider introduction of breast cancer screening programs has also placed a higher toll on radiologists. 

For instance—

The European Union guidelines for breast screening aim to invite up to 70-75% of eligible women in member states to such programs. With a significant increase in patient volumes, healthcare professionals require smarter decision-making support. Emerging artificial intelligence solutions have proven to deliver such guidance in clinical settings. 

A breast clinic at Radboud University Medical Center recently adopted a tomosynthesis tool to obtain 3D images of the breast, offering a greater level of digitalization for diagnostics. 

But—

The trade-off of using such computer-generated images is longer reading time. 

Thus, many specialists now rely on artificial intelligence software to facilitate analysis. These tools can process images faster and provide valuable insights for decision-making, empowering radiologists to make evaluations up to 15-20% faster.

Here’s yet another example.

A recent collaboration between Google Health and physicians yielded very promising results. 

The group is working on a supporting tool for mammography readings. When put to the test, the AI algorithm was as accurate as a serial reading done by two doctors. 

It was the speed, however, that was a significant differentiator—the system reduced the workload of the second reader by 88%. 

But speed and accuracy alone won’t solve the bigger problem we are facing, namely—

A rapidly growing shortage of radiologists across the globe.

In fact, the shortage in the UK already stands at an estimated 27-37%, depending on the region. The Association of American Medical Colleges estimates that nearly 42,000 radiology vacancies will remain unfilled by 2033. 

In developing regions, access to professionals and equipment is even more dire. AI systems can reduce the pressure on healthcare systems and facilitate the scaling of breast screening and cancer detection programs. 

5. Artificial Intelligence for Digital Pathology

artificial intelligence image annotation in digital pathology

Digital pathology is an emerging sub-division of conventional microscopy, enabling practitioners to virtualize glass pathology slides for more in-depth analysis. 

Artificial intelligence algorithms can assist pathologists with:

  • Image analysis and interpretation 
  • Detailed inspections of sample tissues 
  • Matching pathology types to earlier cases
  • Diagnosis accuracy and early detection 

A group of cancer researchers recently analyzed a public database of whole-slide images (WSIs) from 11,000 cancer patients, featuring 32 cancer subtypes. The trained algorithm leveraged image data and annotations to reach a “computational consensus” on the type of pathology on display. 
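The paper's "computational consensus" is considerably more involved, but the core idea of aggregating many per-tile predictions into one slide-level call can be sketched as a majority vote. The class names and tile counts here are hypothetical:

```python
from collections import Counter

def slide_consensus(tile_predictions):
    """Aggregate per-tile class predictions into a slide-level diagnosis
    by majority vote, returning the winning label and its vote share."""
    counts = Counter(tile_predictions)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(tile_predictions)

# Hypothetical classifier outputs for 100 tiles of one whole-slide image
label, confidence = slide_consensus(["melanoma"] * 70 + ["benign nevus"] * 30)
print(label, confidence)  # melanoma 0.7
```

The vote share doubles as a crude confidence score: slides where tiles disagree heavily can be routed to a pathologist for manual review.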

The tool identified different types of pathology on frozen section slides with high accuracy:

  • 93% for bladder urothelial carcinoma 
  • 97% for kidney renal clear cell carcinoma
  • 99% for ovarian serous cystadenocarcinoma

And performed equally accurately with histopathology slides:

  • 98% for prostate adenocarcinoma
  • 99% for skin cutaneous melanoma
  • 100% for thymoma
Results for frozen sections (top) and permanent diagnostic slides (bottom)


digital pathology AI diagnostics results
Source: Nature 


Similar artificial intelligence tools can support clinical decision-making and enhance the diagnosis of lesser studied pathologies and early-stage variations. 

6. Automation of Administrative Tasks in Healthcare using Natural Language Processing

Automatically processed handwritten prescription with AI

So, how about using AI for handling clinical documentation, clinical decision support systems, and other electronic health records?

Well, first of all—

The application of AI technology and machine learning to administrative healthcare tasks can help healthcare providers trim operational costs by a whopping $16.3 billion.

Modern artificial intelligence algorithms already saved over $122 billion for early adopters of process automation solutions. 

Machine learning and artificial intelligence can be used to automate: 

  • Prior authorizations (still largely fax-based) 
  • Healthcare claims management 
  • Patient records management 
  • Appointment booking, scheduling, and management 
  • Clinical records creation and processing 

Secondly, natural language processing (NLP), in particular, has proved to drive major productivity gains. 

Healthcare providers deal with vast volumes of structured and unstructured patient data that need to be captured in EHR/EMR systems. 

The problem, however, is that up to 80% of health data is unstructured, rendering traditional text mining algorithms ineffective. As a result, a lot of big data in healthcare remains siloed and underutilized. 

This negatively impacts patient outcomes. 

NLP systems can help providers process more information faster, including SDOH data (social determinants of health). 

Healthcare professionals say that this data contains important non-clinical insights that impact patients’ well-being and care delivery. But resurfacing it for analysis is difficult, as it remains siloed across different mediums: spoken notes, patient portals, email exchanges, and EHR systems. 

“Medical care is estimated to account for approximately 20% of healthcare outcomes, which is an important reason why healthcare leaders should consider SDOH factors and their impact on patient health.”

says Dr. Elizabeth Marshall in an interview with Health Tech IT. 

According to Marshall, natural language processing algorithms can help access such data and translate it into insights for diagnosis and wider healthcare initiatives. These include care programs, better patient resources, and targeted interventions. 
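A production system would use a full clinical NLP pipeline, but the basic idea of surfacing SDOH mentions in free-text notes can be sketched with a keyword lexicon. The categories and terms below are hypothetical examples, not a validated vocabulary:

```python
# Hypothetical SDOH lexicon; real systems use curated clinical vocabularies
SDOH_TERMS = {
    "housing": ["homeless", "housing instability", "eviction"],
    "food": ["food insecurity", "skips meals"],
    "transport": ["no transportation", "missed appointment due to transport"],
}

def flag_sdoh(note):
    """Return the SDOH categories whose keywords appear in a free-text note."""
    text = note.lower()
    return sorted(cat for cat, terms in SDOH_TERMS.items()
                  if any(term in text for term in terms))

print(flag_sdoh("Patient reports food insecurity and recent eviction."))
# ['food', 'housing']
```

Even this naive matcher shows why NLP helps: the flags can be written back into structured EHR fields, where they become searchable and reportable instead of staying buried in narrative text.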

Apart from improving patient outcomes, natural language processing can also help hospitals recoup some operational costs. 

For instance—

Concord Hospital recently adopted Nuance’s AI-powered software suite for speech recognition and real-time clinical records processing. The staff can now dictate notes from any workstation or personal smartphone, and the data is auto-transcribed and added to the respective patient file. 

Phone-based transcription usage was reduced by 91%, and the institution recorded savings of $1 million.

NLP-powered applications can also improve patient care. 

Intelligent virtual assistants can converse with patients to collect symptom data while they wait and help identify more critical patients for immediate admission. 

In addition, voice assistants can be more convenient for patients with limited mobility who require a quick consultation or adjustment in their room settings when hospitalized. 

7. AI for Ophthalmology

OCT analysis AI annotation
Retinal layers visualized on V7 for OCT image labelling used in early-stage glaucoma detection

Lastly, let’s see how artificial intelligence can be used in ophthalmology.

Research shows that healthcare systems are under increasing strain in providing timely ophthalmology care. Due to rapidly aging populations, public healthcare systems cannot always guarantee timely access to specialists. 

This can lead to poor patient outcomes. 

Delays in ophthalmology treatment can lead to irreversible harm such as visual field loss and visual acuity deterioration that could have been prevented with earlier diagnosis. 

Other eye diseases, such as diabetic retinopathy (DR), cataracts, glaucoma, and age-related macular degeneration (AMD), also require timely intervention. 

Emerging AI and deep learning use cases attempt to compensate for gaps in diagnosis and accelerate providers’ ability to perform population-level screenings. 

  • Diabetic retinopathy detection: In 2018, FDA approved the first AI-powered system for autonomously diagnosing diabetic retinopathy. The IDx-DR system captures eye shots with a fundus camera and performs analysis in under a minute. The clinical trials demonstrated an 87% sensitivity and 90% specificity in detecting early signs of DR among patients. 
  • Glaucoma detection: A study conducted in 2020 also notes that AI methods can improve glaucoma diagnostic procedures. An algorithm trained on OCT images and fundus photography proved more effective at detecting glaucoma progression earlier, even when using data from a single visual field (VF) test.
  • Cataract treatment management via telehealth: A group of Chinese researchers proposed a novel system for remote cataract progression monitoring and intervention. They used deep learning algorithms to create a smartphone-based AI assistant that patients can use for check-ins. The algorithm can analyze the changes in cataract progression, make personalized predictions, and help schedule timely visits and checkups.


Source: Nature 

The Future of AI In Healthcare

The growing availability of annotated healthcare datasets and medical imaging serves as a springboard from which new artificial intelligence use cases emerge. 

Most emerging technologies are yet to be tested in wider clinical settings. Yet—

AI in healthcare has a bright future ahead.

Artificial intelligence, machine learning, and deep learning have proven useful in helping both healthcare providers and patients. The current (and future) applications include:

  • Revenue cycle management 
  • Robotic process automation
  • Drug development processes
  • Clinical documentation management
  • Claims processing
  • Diagnosis and treatment applications

AI can also analyze data on patients’ past clinic visits, prescribed medications, lab tests, and procedures.

Theoretical results and early pilots instill great confidence that computer science professionals and doctors, joining forces, can transform patient care and improve healthcare organizations across the globe. 

However—

For AI in healthcare to go mainstream, wider healthcare sector transformations are required, too. Hospitals will have to invest in new technology systems and staff upskilling. They will need to undertake substantial integration projects, and focus their efforts on the adoption of those technologies in their daily clinical practice.

Tomas Laurinavicius
Guest Author

Tomas is an entrepreneur, designer, and technology blogger from Lithuania. He developed a readership of over 1 million on his personal blog and writes for the Huffington Post and Forbes.
