Radiography, intraoral scans, and facial scans often present dental practitioners with an overwhelming quantity of unstructured data.
The good news?
AI-driven dental imaging software can help make sense of the data quickly and efficiently.
ML is already being used for caries detection, periodontal screening, oral cancer detection, endodontics, orthodontics, and more. The best part?
Machine learning algorithms have even been shown to outperform dentists at diagnosing tooth decay and at predicting whether a tooth should be extracted, retained, or receive restorative treatment.
Take a look at the breadth of ML research across a variety of dental fields:
Now, before you start worrying that a robot is going to replace the friendly human who looks after your teeth, know that ML and computer vision systems are being applied to support your dentist in giving the best treatment possible.
Here’s what we’ll cover:
In case you're curious about AI applications in other industries, check out:
Ready to streamline AI product deployment right away? Check out:
Now let’s get right to it and see what’s in store for the future of dentistry!
Just as “two heads are better than one”, enlisting the help of additional (computer vision) eyes can improve dentists’ ability to identify and treat issues.
And sometimes that extra help is more valuable than you might expect. Let’s take a look at two examples: dental decay and periodontal disease.
Computer vision systems can detect dental decay using techniques like object detection and semantic segmentation.
One way this works is by training CNNs on large sets of images with labeled carious lesions. Once the model training is complete, the algorithms are ready to be fed raw data to identify those lesions on their own.
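For the curious, here's a minimal sketch of what that training loop can look like in PyTorch. The dataset here is a random stand-in, and the model choice and hyperparameters are purely illustrative, not any particular product's pipeline.

```python
# Minimal sketch: fine-tuning a segmentation CNN to highlight carious lesions.
# The data below is a random stand-in for labeled bitewing radiographs, where each
# mask marks carious pixels as 1 and background as 0.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision.models.segmentation import deeplabv3_resnet50

images = torch.rand(8, 3, 256, 256)            # placeholder radiographs
masks = torch.randint(0, 2, (8, 256, 256))     # placeholder lesion masks
loader = DataLoader(TensorDataset(images, masks), batch_size=4, shuffle=True)

# In practice you would start from pretrained weights; None keeps the sketch offline.
model = deeplabv3_resnet50(weights=None, weights_backbone=None, num_classes=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(2):                          # a real run needs many more epochs
    for batch_images, batch_masks in loader:
        logits = model(batch_images)["out"]     # (B, 2, H, W) per-pixel class scores
        loss = criterion(logits, batch_masks.long())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```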
Why does it matter?
The correct identification of lesions (e.g., cavities) is key to diagnosing tooth decay early.
Dentists have been cavity detectives for ages. But—
How does ML stack up when it comes to identifying tooth decay?
To answer this, Pearl commissioned a study that compared the diagnostic consistency of three experienced dentists against a dental AI system.
The three human dentists examined and highlighted carious lesions in more than 8,000 radiographic images of bitewings and periapicals. One of the major aims of the study was to see how often the dentists agreed on the presence or absence of caries.
The results were rather unnerving.
The dentists unanimously agreed on the absence of tooth decay 79% of the time.
Not bad, but not exactly inspiring either.
More alarming is that they unanimously agreed on the presence of decay in only 4.2% of the cases!
Let that sink in.
While this is admittedly a low bar to set, the study found that the CV/ML tool predicted the existence of caries better than the dentists’ consensus rate, better than when a single dentist was used as “Ground Truth”, and better than when the intersection of two dentists’ annotations was used.
The conclusion?
It really wouldn’t be a bad idea for your dentist to check in with an AI colleague for a second opinion.
This is not to say that dentists are bad at their jobs. Rather, AI is just really good at analyzing and processing large amounts of complex data.
For example, dentalXrai Pro is an AI-powered program that helps dental practitioners analyze radiographs more accurately and consistently. It doesn’t replace dentists’ expertise but instead helps them to identify problems and potential treatments with even greater speed and precision.
The co-founder of dentalXrai Pro says:
“AI is not responsible for the dental examination and does not reach decisions on the treatment. However, dentalXrai Pro raises dentistry to a standardized, high-quality level and immensely speeds up the analysis of X-rays, so that dentists can use the time more effectively for talking to patients.”
dentalXrai Pro achieves this thanks to artificial neural networks that can be trained with large sets of annotated dental radiographs.
The program learns to tell the difference between dental anomalies such as caries, infections, and root canal fillings. It can then suggest an almost instantaneous diagnosis when shown a single digitized patient X-ray.
This saves the dentist time, increases diagnosis accuracy, and leaves more resources available for patient care.
Periodontal diseases are caused by bacteria that set up camp on your teeth and infect the surrounding soft tissue. This creates inflammation and can ultimately lead to tooth loss.
Dentists can determine how advanced the disease is with a technique called depth probing.
Luckily, AI can help out by providing additional detection tools and automated depth probing.
In the first case, CNNs can perform image classification and image segmentation on radiographic images of periodontally compromised teeth. They then identify patterns and perform edge detection to determine the stage of the disease.
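To make the edge-detection step a bit more concrete, here's a small OpenCV sketch. The file name and thresholds are assumptions; real systems tune these for their imaging hardware and dataset.

```python
# Illustrative sketch of the edge-detection step on a periapical radiograph.
# The file name and Canny thresholds are assumptions, not a validated pipeline.
import cv2

def bone_edges(radiograph_path: str):
    img = cv2.imread(radiograph_path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(img, (5, 5), 0)        # suppress film grain / sensor noise
    edges = cv2.Canny(blurred, threshold1=40, threshold2=120)
    return edges                                       # binary map of bone and tooth boundaries

edges = bone_edges("periapical_36.png")                # hypothetical file name
cv2.imwrite("periapical_36_edges.png", edges)
```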
But that’s not all.
Researchers in this study came up with a Computer-Aided Periodontal Disease Diagnosis system using computer vision.
Essentially, the system “automates the depth probing and incorporates a color camera fitted together with a plastic probe that automatically and exactly obtains the depth probing measure”.
And... It works!
When the system was tested against the depth probing results of two human periodontists, the AI system’s diagnosis proved to be correct every time.
The upshot here is not that we’ll need fewer dentists in the future, but that AI can help dentists make more effective diagnoses and devote more energy to providing optimal treatment.
It’s kind of like how the internet saves you a trip to the library, which gives you more time to actually do something with the information you’ve found. Except, in this case, it’s giving the dentist a better opportunity to keep your teeth from falling out!
While losing a tooth can be traumatic, it pales in comparison to the effects of oral cancer.
In 2021, there were more than 10,800 deaths due to oral cavity or oropharyngeal cancer in the U.S. alone.
This number is made even more tragic when considering that oral cancer’s survival rate is 83% when detected in its early stages.
Unfortunately, only 29% of cases are detected early.
What’s more, detecting the early signs of oral cancer is not particularly challenging.
Visible oral lesions called “oral potentially malignant disorders” (OPMDs) are a strong sign of cancer and can be detected in routine oral examinations by a general dentist. The problem is that this kind of examination just doesn’t occur frequently enough during dental checkups.
If only there were an efficient, cost-effective way to automate the detection of malignant or potentially malignant lesions…
At the moment, ML and CV technologies haven’t developed quite enough to help keep us safe from oral cancer. But with more access to properly labeled data, there’s a strong potential for it to get there.
In fact, several V7 users are already working hard on annotating dental data and building models for oral cancer detection.
A recent study on “Automated Detection and Classification of Oral Lesions Using Deep Learning” used two deep learning-based computer vision approaches to classify oral lesions.
It applied image classification with ResNet-101 and object detection with the Faster R-CNN to more than 2,000 images with the goal of answering two questions:
1) Are lesions present?
2) Do lesions pose a cancer risk?
While not perfect, the AI system yielded promising results.
Image classification managed an accuracy of 87% for detecting the presence of oral lesions and 78% accuracy for classifying those that needed referral for treatment. Object detection fared somewhat worse, with only 41% accuracy detecting lesions that posed a cancer risk.
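If you're wondering what the classification side of such a study looks like in code, here's a hedged sketch of adapting a pretrained ResNet-101 with torchvision (0.13 or later). The class labels and dummy data are placeholders, not the study's actual pipeline.

```python
# Sketch: adapting an ImageNet-pretrained ResNet-101 to classify oral photographs.
# Class count and dummy input are illustrative only.
import torch
import torch.nn as nn
from torchvision.models import resnet101, ResNet101_Weights

NUM_CLASSES = 2  # e.g., "lesion present" vs. "no lesion"

model = resnet101(weights=ResNet101_Weights.DEFAULT)          # downloads ImageNet weights
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)       # new classification head

dummy = torch.rand(4, 3, 224, 224)                            # four fake RGB photos
logits = model(dummy)
print(logits.shape)                                           # torch.Size([4, 2])
```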
Nevertheless, it’s a good start and more progress is being made.
Another study from 2020 used neural networks and microscopic images to detect signs of oral cancer.
The researchers compared several state-of-the-art AI models with a transfer learning approach. They then compiled and released an augmented dataset of high-quality microscopic images of oral cancer.
The result was that their transfer learning approach yielded 10-15% absolute improvement when compared with a simple Convolutional Neural Network (CNN) baseline.
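The transfer-learning idea itself is easy to sketch: reuse a pretrained backbone as a fixed feature extractor and train only a small new head. The model choice below (ResNet-18) and layer sizes are illustrative, not the study's exact setup.

```python
# Sketch of transfer learning: freeze a pretrained backbone, train only the new head.
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights

def build_transfer_model(num_classes: int = 2) -> nn.Module:
    backbone = resnet18(weights=ResNet18_Weights.DEFAULT)      # ImageNet features
    for param in backbone.parameters():
        param.requires_grad = False                            # freeze pretrained layers
    backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)  # trainable head
    return backbone
```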
So again, while the research and resources aren’t quite there yet for automated detection of oral cancer, they’re certainly headed in the right direction.
We met dental caries back in our discussion of tooth decay, but here we’ll look at them in more depth. After all, “looking at” cavities is one of the most frequent and important jobs a dentist has.
Dental caries affect just about everybody.
Fissures in the occlusal surface of a tooth are fairly normal, and this is where cavities most often like to take hold.
If you take into account initial lesions as well as actively infected ones, 60-90% of school-aged children and nearly 100% of adults in most industrialized nations have some form of caries on their teeth.
As with oral cancer, early detection of dental caries is critical to preventing irreversible harm. Treating cavities early significantly reduces the cost of treatment, the time of restoration, and the risk of tooth loss.
Traditionally, dentists have identified dental caries with a simple visual inspection – most of us can probably relate to the experience of lying in a clinic chair while the dentist pokes around our teeth, all the while fearing we’ll hear, “Oh, I see a cavity!”
Recently, however, computer-aided detection and diagnosis (CAD) systems have been making their way toward becoming a standard part of dental clinics. These systems can read dental X-rays and cone-beam computed tomography (CBCT) images for signs of oral pathology.
CAD systems are particularly good at using dental X-rays to spot caries that occur between teeth, which are often quite difficult to see with the naked eye. Further, computer vision-powered systems can estimate the depth of lesions and use this information to detect and classify caries.
A study published in August 2021 shows this type of AI detection in action. Deep learning with convolutional neural networks (CNNs) was used to detect and categorize dental caries in intraoral imaging.
A dataset of 2,417 anonymized photographs of teeth was sorted into three categories (caries-free, non-cavitated caries lesion, or caries-related cavitation) and used to train the AI model with image augmentation and transfer learning.
To ensure reliable results, the image set was split into a training set and a test set, and performance was then validated after training on 25%, 50%, 75%, and 100% of the training images.
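Here's a rough sketch of that evaluation protocol, assuming a held-out test split of 20% (the study's exact split isn't spelled out here) and placeholder file paths and labels.

```python
# Sketch of the protocol above: hold out a test set, then train on growing
# fractions of the training images. Paths, labels, and split ratio are placeholders.
import random
from sklearn.model_selection import train_test_split

image_paths = [f"tooth_{i:04d}.jpg" for i in range(2417)]              # placeholder paths
labels = [random.choice(["caries-free", "non-cavitated", "cavitated"])
          for _ in image_paths]                                        # placeholder labels

train_x, test_x, train_y, test_y = train_test_split(
    image_paths, labels, test_size=0.2, stratify=labels, random_state=42)

for fraction in (0.25, 0.50, 0.75, 1.00):
    n = int(len(train_x) * fraction)
    subset_x, subset_y = train_x[:n], train_y[:n]
    # train_model(subset_x, subset_y) and evaluate on (test_x, test_y) ...
    print(f"training on {n} images ({fraction:.0%} of the training set)")
```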
So how well did our AI dental assistant do? Quite well!
Based on results from all the test images, the CNN managed to detect caries with 92.5% accuracy in standardized, single-tooth photographs.
Thus, this kind of AI-powered tool represents an efficient, accurate way for dentists to complement visual inspection and optimize caries detection. And this is good news since dental caries are bound to affect just about everybody at some point.
If you’ve ever had a root canal, then you’ve come face to face (literally) with endodontics.
Luckily, AI has applications that can help dentists detect and treat these dreaded pathologies even more effectively.
Endodontists typically use radiographic images to examine, measure, and evaluate the condition of the tooth down in the gums (i.e., the root).
AI models can also look at these images and determine the structure, measurements, tissue viability, and even potential success of treatment for those out-of-sight portions of the tooth.
Deep learning algorithms can then detect, locate, and classify different aspects of tooth root anatomy and possible pathologies. This is useful for locating specific tooth structures or identifying particular kinds of fissures and lesions in or around the tooth.
In this paper, for example, researchers used CNNs to detect periapical lesions on CBCT images. They found that the AI model had an astonishing 92% reliability in this task.
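CBCT scans are volumetric, so one natural way to process them is with a 3D convolutional network over voxel patches. The toy model below is purely illustrative and is not the architecture from the cited paper.

```python
# Illustrative only: a tiny 3D CNN that classifies a CBCT voxel patch as
# "periapical lesion" vs. "no lesion". Architecture and sizes are assumptions.
import torch
import torch.nn as nn

class TinyCBCTNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
        )
        self.classifier = nn.Linear(16 * 16 * 16 * 16, num_classes)

    def forward(self, x):                     # x: (B, 1, 64, 64, 64) voxel patch
        feats = self.features(x)              # -> (B, 16, 16, 16, 16)
        return self.classifier(feats.flatten(1))

logits = TinyCBCTNet()(torch.rand(2, 1, 64, 64, 64))
print(logits.shape)                           # torch.Size([2, 2])
```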
And while detecting lesions in the tissue around the root is important, determining the length of the root in the gum (working length) is also critical to endodontic treatment.
There are several ways to achieve this—including just feeling the gum tissue—but reading dental radiographs is the most common. This is good news for AI since computer vision works well on radiographic images (as opposed to poking fingers in the mouth).
For example, this study reports on the use of electronic apex locators and CBCT imaging to locate the apical foramen using an AI model.
Accurate location of the apical foramen is crucial for determining the working length of a tooth, and the artificial neural networks (ANNs) used in the study were able to achieve both with 93% accuracy.
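As a purely hypothetical illustration, a small neural network for this kind of task could regress the working length from a couple of numeric measurements, such as an apex locator reading and a CBCT-derived length estimate. The features and network size below are assumptions, not the cited study's design.

```python
# Hypothetical sketch: a small MLP that regresses working length (mm) from
# numeric inputs. Feature choice and layer sizes are assumptions for illustration.
import torch
import torch.nn as nn

working_length_net = nn.Sequential(
    nn.Linear(2, 16), nn.ReLU(),
    nn.Linear(16, 16), nn.ReLU(),
    nn.Linear(16, 1),                          # predicted working length in millimetres
)

features = torch.tensor([[21.4, 21.9]])        # [apex locator reading, CBCT estimate], mm
print(working_length_net(features))            # untrained output; training would use MSE loss
```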
Once again, AI is getting consistent As on its tooth inspection tests!
So far we’ve seen that AI computer vision and machine learning can be of great help in diagnosing and treating issues in and around the teeth. But—
Can it help with moving teeth into the right place?
Indeed it can.
Orthodontics requires a significant amount of planning, and AI can be used to optimize analytics for predicting things like the size of unerupted teeth or the potential need for extraction.
Orthodontists also need to determine the best path for teeth to move in. AI algorithms can take a starting point and target endpoint and calculate the best way for a tooth or group of teeth to reach their optimal destination.
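A heavily simplified version of that planning step might interpolate a tooth's position from start to target in equal stages, capped by an assumed per-stage movement limit. The 0.25 mm cap below is an assumption for illustration, not a clinical rule.

```python
# Simplified sketch of staging tooth movement: split the path from the current
# position to the planned position into equal steps under an assumed per-stage cap.
import math

def plan_stages(start, target, max_step_mm: float = 0.25):
    distance = math.dist(start, target)                    # total movement in mm
    n_stages = max(1, math.ceil(distance / max_step_mm))
    return [tuple(s + (t - s) * k / n_stages for s, t in zip(start, target))
            for k in range(1, n_stages + 1)]

stages = plan_stages(start=(0.0, 0.0, 0.0), target=(1.2, 0.4, 0.0))
print(len(stages), "stages, ending at", stages[-1])
```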
This saves dental practitioners a lot of time and increases the efficiency of tooth movement. As the old carpenter’s adage goes: “Measure twice, braces once.”
In addition to moving teeth, dental professionals can also use computer vision to diagnose bone pathologies in the mouth. For example, it’s already being widely used to diagnose and classify osteoarthritis in the temporomandibular joint.
So from dealing with a bit of pain to get your teeth straight to relieving an arthritic jaw, AI is proving to be a promising friend to dentists and orthodontists alike.
While all that we have covered has great potential for transforming dentistry, there are currently some limitations to the widespread application of AI in the dental industry.
The most significant is access to quality training data in the form of properly labeled dental datasets.
AI algorithms can only perform accurately if they’re given relevant data to learn from. This means using annotations such as bounding boxes or polygons to label objects of interest and highlight relevant areas.
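For a concrete picture of what such labels look like, here's a simplified, COCO-style annotation for a single carious lesion, with made-up coordinates and IDs.

```python
# A simplified, COCO-style annotation for one carious lesion: a bounding box plus a
# polygon outlining the lesion. Field names follow the COCO convention; values are invented.
annotation = {
    "image_id": 101,
    "category_id": 1,                          # e.g., "caries"
    "bbox": [412.0, 233.0, 58.0, 41.0],        # [x, y, width, height] in pixels
    "segmentation": [[412.0, 240.0, 430.0, 233.0, 470.0, 245.0,
                      465.0, 274.0, 425.0, 270.0]],  # polygon vertices (x1, y1, x2, y2, ...)
    "iscrowd": 0,
}
```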
In general, publicly available datasets are a great resource for AI training.
However, the trouble for dental AI isn’t an unwillingness or inability to compile datasets or access those currently available. Rather, the data needed for training artificial intelligence in dentistry must be anonymized, and its collection requires patients’ consent.
Annotating medical data requires a HIPAA and FDA compliant image annotation platform, and V7 offers just this.
Have a look below at how you can label your medical imaging data with V7.
The data can be labeled in-house, or you can hire professional dental annotators to help you create ground truth faster.
V7 allows you to annotate X-ray, CBCT, MRI images, and more. Go ahead and schedule a demo with our team to discuss your use case.
Machine learning and computer vision systems have incredible potential for future dental care, from improving early detection of oral cancer to boosting efficiency in orthodontics.
As we’ve seen, the importance of artificial intelligence for dentistry is hard to overstate. When working in tandem with human dental practitioners, AI technologies improve diagnostic accuracy, reduce costs, and lead to better long-term patient outcomes.
Another major benefit is AI’s ability to standardize dental diagnosis and treatment.
Dentists’ evaluation of patient data is subjective, and research has shown that diagnosis is not always consistent between practitioners. Smart, new technologies in dentistry provide a way to significantly increase consistency, and as a result, improve patient health.
In other words, CV and ML are poised to create a major win-win for patients and dentists alike.
If you are interested in learning more about AI applications across other industries, check out: