7 Life-Saving AI Use Cases in Healthcare

From early disease detection through enhanced medical decision-making to better patient outcomes—here's how AI technology is transforming the healthcare industry.
8 min read  ·  June 1, 2021

Can AI transform our healthcare system?

The short answer is: Yes. 

But—

The question we should be really asking ourselves here is: How?

How can the healthcare industry benefit from the use of artificial intelligence? 

Areas of impact for AI in healthcare

Multiple research studies suggest that:

  • Over 93% of practitioners agree that ML-powered processes could help resurface hidden or unobtainable value in the healthcare industry. (Accenture)
  • AI can achieve better patient outcomes, enhance care delivery, and drive operational gains for the healthcare industry. (EIT Health and McKinsey)
  • Over 69% of healthcare companies are piloting or adopting AI solutions, ranging from administrative assistance to preliminary diagnosis. (Accenture)

From enhanced care delivery through AI-assisted diagnostics to medical records management, AI technology is revolutionizing modern healthcare using machines that can learn, understand, predict, and act.


💡 Pro tip: Looking for a medical image and video annotation tool? Check out our Medical Image Annotation with V7 guide.

1. AI-Assisted Chest X-Ray Analysis for Pulmonary Diseases 

X-ray and computed tomography (CT) are two trusted means of diagnosing pulmonary diseases. 

Such medical imagery provides doctors with life-saving insights that lead to better health outcomes.


However—

It takes a great deal of experience to interpret the images properly. 

When the patient volumes are high and the timing is critical, doctors need reliable tools that can help them make accurate predictions and act immediately.

That’s where artificial intelligence applications come in handy.

State-of-the-art image recognition algorithms can pick up patterns in X-ray images that are too subtle for the human eye. The machine learning technologies also work at a faster pace—and assist with patient classification and triage. 

💡 Pro tip: Check out The Ultimate Guide to Medical Image Annotation.

COVID-19 

A group of researchers recently came up with an artificial intelligence solution for screening COVID-19 patients based on their lung X-rays. 

The system uses principal component analysis (PCA) and clustering techniques to distinguish between L and H phenotypes of COVID-19.

(Clustering of COVID-19 patients from frontal (PA) X-ray images. Source: medRxiv)

This AI tool could help healthcare professionals make better treatment decisions: determine whether the patient needs mechanical ventilation to survive (H phenotype) or if applying this treatment could cause lung injury (L phenotype). 
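
To make the approach more concrete, here is a minimal sketch of the PCA-plus-clustering idea, assuming a stack of pre-processed frontal X-rays stored as fixed-size grayscale arrays. The array shapes, component count, and cluster count are illustrative assumptions, not details from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for a stack of 40 flattened 256x256 X-ray images (N, 65536).
# In practice these would be loaded from DICOM/PNG files and normalized.
xrays = rng.random((40, 256 * 256)).astype(np.float32)

# Reduce each image to a handful of principal components that capture
# the dominant intensity patterns across the cohort.
pca = PCA(n_components=10, random_state=0)
features = pca.fit_transform(xrays)

# Group patients into two clusters, mirroring the L/H phenotype split
# the researchers were interested in.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(features)

print("Cluster assignments:", labels)
print("Explained variance ratio:", pca.explained_variance_ratio_.round(3))
```

On real data, the cluster assignments would then be reviewed against clinical criteria before informing any ventilation decision.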

Another group of medical researchers developed an automated image classification system based on Microsoft CustomVision. 

The AI solution also helps classify different types of pneumonia (COVID-19 and pneumonia from other etiologies). 

When tested on a patient data sample, the model performed with 92.9% sensitivity (recall) and correctly identified the cases shown below:

(Figure: COVID-19 AI model performance on the patient sample)

Pretty impressive, right?

It’s only a matter of time before more comprehensive patient data helps advance AI solutions for diagnosing pulmonary diseases.

And speaking of patient data...

We’ve been working on that, too :)

💡 Pro tip: You can check out 21+ Best Healthcare Datasets for Computer Vision.

Lung Cancer 

Using AI has proven to be effective in identifying previously undetected cancers on chest X-rays.

Clinical researchers from the Seoul National University Hospital found that a commercially available AI application outperformed a team of four thoracic radiologists on first and second reads. 

The tested artificial intelligence algorithm:

  • Produced higher specificity than the radiologists during first reads.
  • Helped the team members better interpret the results during the second read. 

The experiment results are promising, but—

Medical professionals still doubt whether AI algorithms can produce consistent results in clinical settings when “fed” larger data samples. 

Indeed, the problem of bringing machine learning technology to production is a pressing one for healthcare companies. 

Many require modernization of their current software systems before they can run advanced machine learning workloads. Others need more mature data management and data governance practices to ensure the security of processed imagery and other patient records.

"Windowing" allows humans and machines to see details in X-rays beyond the range of colours that our monitors enable us to see.

2. Dermatology Scans for Melanoma

Skin cancer is one of the most common types of cancer. 

Melanoma may be the rarest subtype, but it will be responsible for over 7,000 deaths this year in the United States alone.

Here’s the good news—

Up to 86% of melanoma cases are preventable, and the disease is highly treatable when detected at an early stage. As you've probably guessed, artificial intelligence and machine learning can be of help.

Yet, the devil is in the details. 

Early-stage melanoma is difficult to distinguish from benign moles and other malignant neoplasms. And that is why doctors are experimenting with AI technology to gain extra decision support. 



The International Skin Imaging Collaboration (ISIC) has been a driving force in creating annotated datasets of skin lesion images to accelerate the quality of computer-aided research. 

Commoditized access to cloud computing technologies and active knowledge sharing among machine learning professionals also helped significantly accelerate skin cancer research over the past five years. 

Here are some encouraging findings:

In 2019, researchers used a convolutional neural network (CNN) built on InceptionV3 and ResNet50 architectures to analyze close-up and dermoscopic medical imaging datasets. 

The goal was to detect non-pigmented skin cancers. 

After training the model, they compared how artificial intelligence performed against 95 dermatologists. 

What was the result?

The AI delivered the same levels of accuracy as human experts. 

The same levels of accuracy.

Let that sink in for a second. 

And that’s not all—the tool also performed better than beginner- and intermediate-level dermatologists. 

AI accuracy vs. experts' accuracy in identifying non-pigmented skin cancers
Source: US National Library of Medicine


Another scientific group pitted InceptionV4 (a deep learning architecture approved for medical purposes by the European Union) against a group of human dermatologists.

Both were asked to diagnose 100 test cases (including 60 benign and 40 malignant skin lesions) by first reviewing a dermoscopic image and then additional clinical close-up images and clinical information. 

Here are the results (with a dermoscopic image only):

(Figure: diagnostic performance of the CNN vs. human raters, dermoscopic image only)


With more clinical information available, the mean sensitivity of the human raters increased to 94.1%, while their mean specificity stayed the same. The CNN again did slightly better, scoring 95% sensitivity, yet the more experienced dermatologists still outperformed the algorithm in many cases. 
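
For readers curious about how such CNN studies are typically set up, below is a minimal transfer-learning sketch in PyTorch: an ImageNet-pretrained ResNet50 backbone with a new two-class head for benign vs. malignant lesions. The frozen backbone, toy batch, and hyperparameters are illustrative assumptions, not the published configurations of these studies.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a pre-trained backbone and replace the final classification layer.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                     # freeze the pre-trained features
model.fc = nn.Linear(model.fc.in_features, 2)       # benign vs. malignant head

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

# Stand-in batch of dermoscopic images (B, 3, 224, 224) and labels.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"one training step done, loss = {loss.item():.3f}")
```

In a real project the toy batch would be replaced by an annotated dermoscopy dataset, and the backbone would usually be unfrozen and fine-tuned after the new head has converged.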

3. CT and MRI Scan Analysis with AI and Machine Learning

Now, let’s see how AI can benefit radiology, which has long been at the forefront of innovation in the healthcare industry. 

This segment is ahead of the adoption curve when it comes to artificial intelligence and machine learning use.

As the accessibility and quality of medical imaging annotation improve, data scientists in this domain can further enhance the performance of existing artificial intelligence tools. 



What’s more, such tools support better decision-making and help doctors win back precious time typically spent on image analysis, freeing them up for clinical and laboratory work. 

💡 At present, AI technologies can help extract over 1,500 data points from CT, MRI, and PET scans to help healthcare professionals perform more accurate diagnoses.

Some of the emerging AI use cases in this category include: 

  • CT scans for pneumonia: A group of Chinese researchers developed a test AI system for analyzing radiology images for early signs of COVID-19 pneumonia. The tool is said to save doctors about 30-40% of detection time, so that they can identify, isolate, and treat contagious patients faster.  
  • Quality augmentation of MRI brain scans: A scientific group from Spain developed a deep learning algorithm for increasing the resolution of MRI images (see the sketch after this list). Doing so helps healthcare professionals identify complex brain-related pathologies, including cancer, speech disorders, and physical trauma. 
  • Accelerated production of MRI images: Facebook AI and NYU Langone Health have been working on a joint project called fastMRI. It offers a novel approach to MRI image creation using AI, which makes the scanning process 4x faster. When such images were shown to radiologists, they couldn't distinguish between the traditional and AI-generated scans. 
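
Below is a minimal, SRCNN-style sketch of the super-resolution idea referenced in the MRI bullet above: interpolate a low-resolution slice up to the target size, then let a small CNN restore fine detail. The architecture, layer sizes, and toy input are assumptions for illustration only, not the published model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=5, padding=2),
        )

    def forward(self, low_res):
        # Interpolate to the target resolution, then refine with the CNN.
        upsampled = F.interpolate(low_res, scale_factor=2, mode="bicubic",
                                  align_corners=False)
        return upsampled + self.net(upsampled)   # residual refinement

model = TinySRCNN()
low_res_slice = torch.randn(1, 1, 128, 128)      # toy low-resolution MRI slice
high_res_slice = model(low_res_slice)
print(high_res_slice.shape)                      # torch.Size([1, 1, 256, 256])
```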

To date, the FDA (Food and Drug Administration) has approved over a dozen deep learning methods for use in radiology. 

Globally, many more AI pilots are moving from the labs to hospital wards. 

An MRI scan of the pelvis with image annotation
💡 Pro tip: Check out 6 Innovative Artificial Intelligence Applications in Dentistry.

4. AI-Assisted Breast Cancer Detection 

Here’s a fact: regular screenings have proven to significantly improve breast cancer survival rates.

Early detection and timely treatment have high efficacy rates. However, the wider introduction of breast cancer screening programs has also placed a heavier burden on radiologists. 



For instance—

The European Union guidelines for breast screening aim to invite 70-75% of eligible women in member states to such programs. With a significant increase in patient volumes, healthcare professionals require smarter decision support. Emerging artificial intelligence solutions have proven able to deliver such guidance in clinical settings. 

A breast clinic at Radboud University Medical Center recently adopted a tomosynthesis tool to obtain 3D images of the breast, offering a greater level of digitalization in diagnostics. 

But—

The trade-off of using such computer-generated images is longer reading time. 

Thus, many specialists now rely on artificial intelligence software to facilitate analysis. These tools can process images faster and provide valuable insights for decision-making, empowering radiologists to make evaluations up to 15-20% faster.

Here’s yet another example.

A recent collaboration between Google Health and physicians yielded very promising results. 

The group is working on a supporting tool for mammography readings. When put to the test, the AI algorithm was as accurate as a serial reading done by two doctors. 

It was the speed, however, that was a significant differentiator—the system reduced the workload of the second reader by 88%. 
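
To illustrate where such workload savings can come from, here is a minimal, hypothetical sketch of an AI-assisted double-reading rule: cases where the model is confident and agrees with the first human reader are signed off, and only the remainder are escalated to a second human. The thresholds and data are made-up assumptions, not Google Health's actual protocol.

```python
from dataclasses import dataclass

@dataclass
class Case:
    case_id: str
    first_reader_recall: bool   # did the first radiologist recall the patient?
    ai_probability: float       # model's probability of malignancy

def needs_second_human_read(case: Case, low: float = 0.05, high: float = 0.7) -> bool:
    """Escalate when the model is uncertain or disagrees with the first reader."""
    ai_recall = case.ai_probability >= high
    ai_confident = case.ai_probability <= low or case.ai_probability >= high
    return (not ai_confident) or (ai_recall != case.first_reader_recall)

cases = [
    Case("A", first_reader_recall=False, ai_probability=0.02),  # confident agreement
    Case("B", first_reader_recall=True,  ai_probability=0.91),  # confident agreement
    Case("C", first_reader_recall=False, ai_probability=0.45),  # uncertain -> escalate
]
for c in cases:
    print(c.case_id, "second human read?", needs_second_human_read(c))
```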

But speed and accuracy alone won’t solve the bigger problem we are facing, namely—

The rapidly growing shortage of radiologists across the globe.

In fact, the radiologist shortage in the UK already stands at roughly 27-37%, depending on the region. The Association of American Medical Colleges estimated that nearly 42,000 radiology vacancies will remain unfilled by 2033. 

In developing regions, access to professionals and equipment is even scarcer. AI systems can reduce the pressure on healthcare systems and facilitate the scaling of breast screening and cancer detection programs. 

5. Artificial Intelligence for Digital Pathology

Digital pathology is an emerging sub-division of conventional microscopy, enabling practitioners to virtualize glass pathology slides for more in-depth analysis. 

Artificial intelligence algorithms can assist pathologists with:

  • Image analysis and interpretation 
  • Detailed inspections of sample tissues 
  • Matching pathology types to earlier cases
  • More accurate diagnosis and earlier detection 

A group of cancer researchers recently analyzed a public database of whole-slide images (WSIs) from 11,000 cancer patients, featuring 32 cancer subtypes. The trained algorithm leveraged image data and annotations to reach a “computational consensus” on the type of pathology on display. 
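
Here is a minimal sketch of how such a tile-level "computational consensus" can be aggregated at the slide level, assuming a classifier has already produced per-tile subtype predictions; the tile labels below are synthetic.

```python
from collections import Counter

def slide_level_consensus(tile_predictions: list[str]) -> tuple[str, float]:
    """Return the majority subtype and the fraction of tiles that agree."""
    counts = Counter(tile_predictions)
    subtype, votes = counts.most_common(1)[0]
    return subtype, votes / len(tile_predictions)

# Synthetic per-tile predictions for one whole-slide image.
tile_predictions = (
    ["kidney renal clear cell carcinoma"] * 46
    + ["bladder urothelial carcinoma"] * 3
    + ["normal tissue"] * 1
)
subtype, agreement = slide_level_consensus(tile_predictions)
print(f"{subtype} ({agreement:.0%} of tiles agree)")
```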

The tool identified different types of pathology on frozen section slides with high accuracy:

  • 93% for bladder urothelial carcinoma 
  • 97% for kidney renal clear cell carcinoma
  • 99% for ovarian serous cystadenocarcinoma

And performed equally accurately with histopathology slides:

  • 98% for prostate adenocarcinoma
  • 99% for skin cutaneous melanoma
  • 100% for thymoma
Results for frozen sections (top) and permanent diagnostic slides (bottom)
Source: Nature 

Similar artificial intelligence tools can support clinical decision-making and enhance the diagnosis of lesser-studied pathologies and early-stage variations. 

6. Automation of Administrative Tasks in Healthcare using Natural Language Processing

So, how about using AI to handle clinical documentation, clinical decision support, and electronic health records?

Well, first of all—

The application of AI technology, specifically OCR, to administrative healthcare tasks can help healthcare providers trim operational costs by a whopping $16.3 billion.

Modern artificial intelligence algorithms have already saved over $122 billion for early adopters of process automation solutions. 

Automatically processed handwritten prescription with AI

Machine learning and artificial intelligence can be used to automate: 

  • Prior authorizations (still largely fax-based) 
  • Healthcare claims management 
  • Patient records management 
  • Appointment booking, scheduling, and management 
  • Clinical records creation and processing 

Secondly, natural language processing (NLP), in particular, has proved to drive major productivity gains. 

Healthcare providers deal with vast volumes of structured and unstructured patient data that need to be captured in EHR/EMR systems. 

The problem, however, is that up to 80% of health data is unstructured, rendering traditional text mining algorithms ineffective. As a result, much of healthcare's big data remains siloed and underutilized. 

This negatively impacts patient outcomes. 

NLP systems can help providers process more information faster, including SDOH data (social determinants of health). 

Healthcare professionals say that this data contains important non-clinical insights that impact patients’ well-being and care delivery. But resurfacing it for analysis is difficult, as it remains siloed across different mediums: spoken notes, patient portals, email exchanges, and EHR systems. 

“Medical care is estimated to account for approximately 20% of healthcare outcomes, which is an important reason why healthcare leaders should consider SDOH factors and their impact on patient health.”

says Dr. Elizabeth Marshall in an interview with Health Tech IT. 

According to Marshall, natural language processing algorithms can help access such data and translate it into insights for diagnosis and wider healthcare initiatives. These include care programs, better patient resources, and targeted interventions. 
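
As a toy illustration of surfacing SDOH mentions from free text, here is a minimal rule-based sketch; the keyword patterns and sample note are hypothetical assumptions, and production systems would rely on trained clinical NLP models rather than keyword matching.

```python
import re

# Hypothetical keyword patterns for a few SDOH categories.
SDOH_PATTERNS = {
    "housing": r"\b(homeless|unstable housing|housing insecurity)\b",
    "transportation": r"\bno transportation\b",
    "food": r"\b(food insecurity|skips meals)\b",
    "social support": r"\b(lives alone|no caregiver)\b",
}

def extract_sdoh(note: str) -> dict[str, list[str]]:
    """Return SDOH categories with the phrases that triggered them."""
    findings = {}
    for category, pattern in SDOH_PATTERNS.items():
        matches = re.findall(pattern, note, flags=re.IGNORECASE)
        if matches:
            findings[category] = matches
    return findings

note = ("Patient lives alone, reports food insecurity and frequently skips meals. "
        "No transportation to follow-up visits.")
print(extract_sdoh(note))
```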

Apart from improving patient outcomes, natural language processing can also help hospitals recoup some operational costs. 

For instance—

Concord Hospital recently adopted Nuance’s AI-powered software suite for speech recognition and real-time clinical records processing. The staff can now dictate notes from any workstation or a personal smartphone, and the data is auto-transcribed and added to the respective patient file. 

Phone-based transcription usage was reduced by 91%, and the institution recorded savings of $1 million.

NLP-powered applications can also improve patient care. 

Intelligent virtual assistants can converse with patients to collect symptom data while they wait and help identify more critical patients for immediate admission. 

In addition, voice assistants can be more convenient for patients with limited mobility who require a quick consultation or adjustment in their room settings when hospitalized. 

7. AI for Ophthalmology

Lastly, let’s see how artificial intelligence can be used in ophthalmology.

Research shows that healthcare systems are under growing strain to provide timely ophthalmology care. Due to rapidly aging populations, public healthcare systems cannot always guarantee timely access to specialists. 

Retinal layers visualized on V7 for OCT image labelling used in early-stage glaucoma detection

This can lead to poor patient outcomes. 

Delays in ophthalmology treatment can lead to irreversible harm such as visual field loss and visual acuity deterioration that could have been prevented with earlier diagnosis. 

Other eye diseases, such as diabetic retinopathy (DR), cataracts, glaucoma, and age-related macular degeneration (AMD), also require timely intervention. 

Emerging AI and deep learning use cases attempt to compensate for gaps in diagnosis and accelerate providers’ ability to perform population-level screenings. 

  • Diabetic retinopathy detection: In 2018, the FDA approved the first AI-powered system for autonomously diagnosing diabetic retinopathy. The IDx-DR system captures eye shots with a fundus camera and performs the analysis in under a minute. The clinical trials demonstrated an 87% sensitivity and 90% specificity in detecting early signs of DR among patients (see the short sketch below for how these two metrics are computed). 
  • Glaucoma detection: A study conducted in 2020 also notes that AI methods can improve glaucoma diagnostic procedures. An algorithm trained on OCT images and fundus photography proved more effective at detecting glaucoma progression earlier while using data from a single visual field (VF) test.
  • Cataract treatment management via telehealth: A group of Chinese researchers proposed a novel system for remote cataract progression monitoring and intervention. They used deep learning algorithms to create a smartphone-based AI assistant that patients can use for check-ins. The algorithm can analyze the changes in cataract progression, make personalized predictions, and help schedule timely visits and checkups.


Cataract treatment management via telehealth
Source: Nature 
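
For reference, here is a short sketch of how the sensitivity and specificity figures quoted for the IDx-DR trial are computed; the confusion-matrix counts are made up purely to reproduce the reported 87% and 90%.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    sensitivity = tp / (tp + fn)   # how many true DR cases the system catches
    specificity = tn / (tn + fp)   # how many healthy eyes it correctly clears
    return sensitivity, specificity

# Toy screening cohort: 100 patients with DR, 400 without.
sens, spec = sensitivity_specificity(tp=87, fn=13, tn=360, fp=40)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```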

The Future of AI In Healthcare

The growing availability of annotated healthcare datasets and medical imaging serves as a springboard from which new artificial intelligence and computer vision use cases emerge. 

Most emerging technologies are yet to be tested in wider clinical settings. Yet—

AI in healthcare has a bright future ahead.

Artificial intelligence, machine learning, and deep learning have proven useful in helping both healthcare providers and patients. The current (and future) applications include:

  • Revenue cycle management 
  • Robotic process automation
  • Drug development processes
  • Clinical documentation management
  • Claims processing
  • Diagnosis and treatment applications

AI can also analyze data on patient visits to the clinic, medications prescribed, lab tests, and procedures performed in the past.

Theoretical results and early pilots instill great confidence, with computer science professionals and doctors joining forces to transform patient care and improve healthcare organizations across the globe.

However—

For AI in healthcare to go mainstream, wider healthcare sector transformations are required, too.

Hospitals will have to invest in new technology systems and staff upskilling. They will need to undertake substantial integration projects, and focus their efforts on the adoption of those technologies in their daily clinical practice.

💡 Read Next:

Computer Vision: Everything You Need to Know

65+ Best Free Datasets for Machine Learning

6 Viable AI Use Cases in Insurance

A Comprehensive Guide to Human Pose Estimation

Tomas is an entrepreneur, designer, and technology blogger from Lithuania. He developed a readership of over 1 million on his personal blog and writes for the Huffington Post and Forbes.
