Artificial Intelligence in Gastrointestinal Endoscopy: A New Era of Precision Care
Imagine an endoscopist performing a colonoscopy while a computer system instantly flags tiny polyps they might miss, even hours into a procedure. That’s the promise of artificial intelligence (AI) in gastrointestinal (GI) endoscopy. As AI evolves from lab experiments to clinical tools, it’s poised to fix some of the biggest challenges in GI care: human error, variability between doctors, and the sheer volume of data from tests like capsule endoscopy.
Ahmad El Hajjar and Jean-François Rey, gastroenterologists at the Arnault Tzanck Institute in France, break down how AI is transforming endoscopy in a 2020 review in the Chinese Medical Journal. Their work highlights AI’s potential to make procedures faster, more accurate, and more reliable—benefiting both patients and doctors.
How AI Works in Endoscopy
At its core, AI uses machine learning (ML)—algorithms that “learn” from data—to recognize patterns in endoscopic images or videos. A subset called deep learning (DL) uses layered “neural networks” (inspired by the human brain) to extract progressively more abstract features from the data, making it far more powerful than standard ML for image analysis. For example, an AI trained on thousands of colon polyp images can spot tiny growths by detecting subtle features: color changes, microvascular patterns, or irregular surfaces.
AI systems fall into two main categories:
- Computer-assisted detection (CADe): Flags lesions (like polyps or bleeding) in real time.
- Computer-assisted diagnosis (CADx): Predicts what a lesion is (e.g., cancerous vs. benign) or how advanced it is—like an “optical biopsy.”
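The split between the two categories can be sketched in a few lines of Python. Everything here is a hypothetical stand-in for a trained model’s output: the per-frame scores, the 0.5 threshold, and the toy surface-pattern rule are illustrative only.

```python
# CADe vs. CADx, as a toy sketch (hypothetical scores and rules,
# standing in for real trained models).

def cade_flag(frame_scores, threshold=0.5):
    """CADe: flag frames whose lesion score exceeds a threshold."""
    return [i for i, s in enumerate(frame_scores) if s > threshold]

def cadx_classify(lesion_features):
    """CADx: label a flagged lesion (toy rule in place of a trained model)."""
    return "neoplastic" if lesion_features["irregular_surface"] else "benign"

# Hypothetical per-frame lesion scores from a detector
scores = [0.1, 0.8, 0.3, 0.9]
flagged = cade_flag(scores)  # frames 1 and 3 exceed the threshold
labels = [cadx_classify({"irregular_surface": f == 3}) for f in flagged]
```

The division of labor mirrors clinical practice: detection answers “is something there?”, diagnosis answers “what is it?”.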
AI in the Esophagus: Catching Early Esophageal Cancer
Esophageal adenocarcinoma (EAC)—linked to obesity and Barrett’s esophagus (BE, a precancerous condition)—is often missed until it’s advanced, when survival rates are low. Current screening uses the Seattle protocol, which takes random biopsies of BE tissue. It’s slow, inefficient, and misses many early cancers.
AI changes this by analyzing detailed images to spot neoplastic (cancerous) changes:
- A 2017 study used volumetric laser endomicroscopy (VLE)—a tool that scans esophageal tissue 3mm deep—to train an AI. The algorithm detected early BE cancer with 95% accuracy, matching expert doctors.
- Another AI, trained on standard endoscopic images, identified early neoplastic lesions in BE patients with 86% sensitivity (catching most cases) and 87% specificity (avoiding false alarms).
- Even small cancers (under 10mm) weren’t missed: An AI using convolutional neural networks (CNNs)—a type of DL—had 95% sensitivity for esophageal cancer, outperforming human experts in some cases.
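The sensitivity and specificity figures quoted above come from two simple ratios over true/false positives and negatives. A minimal Python illustration follows; the counts are hypothetical, chosen only so the output matches the 86%/87% BE result:

```python
# Sensitivity and specificity, the two metrics quoted in the studies above.
# The counts used below are hypothetical, picked to reproduce 86%/87%.

def sensitivity(tp, fn):
    """Share of true lesions the system catches: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Share of healthy tissue correctly cleared: TN / (TN + FP)."""
    return tn / (tn + fp)

# e.g. 86 lesions caught out of 100; 87 healthy frames cleared out of 100
print(f"sensitivity = {sensitivity(86, 14):.0%}")  # 86%
print(f"specificity = {specificity(87, 13):.0%}")  # 87%
```

High sensitivity means few missed cancers; high specificity means few false alarms. A clinically useful system needs both.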
AI doesn’t just improve detection—it eases the pressure on endoscopists worried about missing life-threatening lesions.
AI in the Stomach: Early Gastric Cancer and Beyond
Gastric cancer is the third leading cause of cancer death globally, but early gastric cancer (EGC)—confined to the stomach’s inner layers—is often hard to spot. It can look like gastritis (inflammation) or have no obvious signs, leading to missed diagnoses.
AI helps at every step of EGC care:
- Diagnosis: A 2015 study used blue laser imaging (BLI)—a tool that highlights mucosal details—to train an AI system. It distinguished EGC from red, non-cancerous lesions with high accuracy, even for hard-to-see “superficially depressed” cancers.
- Staging: An AI trained on 902 endoscopic images predicted how deep EGC had invaded with 64.7% accuracy—helping doctors choose between endoscopic resection or surgery.
- H. pylori Detection: AI analyzes endoscopic images to predict Helicobacter pylori infection (a bacterium linked to gastritis and cancer) with 85–90% accuracy. This cuts down on unnecessary biopsies and gives patients real-time results.
One standout study: An AI trained on 13,584 gastric cancer images correctly found 92% of cancers—including all invasive cases. It even spotted lesions that looked like gastritis, a common source of human error.
AI in Capsule Endoscopy: Taming Data Overload
Wireless video capsule endoscopy (VCE) is a game-changer for examining the small bowel—it’s painless, non-invasive, and captures detailed images of hard-to-reach areas. But it has a major flaw: A single capsule produces thousands of images, taking hours to review.
AI solves this by automatically flagging critical findings:
- Bleeding: Two studies used DL to detect GI bleeding in VCE images with over 99% accuracy. This cuts reading time from hours to minutes.
- Celiac Disease: AI analyzes intestinal villi (tiny finger-like projections) to spot the damage caused by celiac disease.
- Hookworms: An AI system detected intestinal hookworms in VCE images with 92% accuracy—spotting parasites that human readers might miss.
AI doesn’t replace doctors here—it augments them, ensuring no critical finding gets buried in data.
AI in the Colon: Fewer Missed Polyps, Less Unnecessary Surgery
Colorectal cancer (CRC) is the third most common cancer and the second leading cause of cancer death. Colonoscopy is its best defense—removing precancerous adenomas can reduce CRC risk by 80%. But doctors miss 6–27% of adenomas, and removing non-cancerous polyps (like hyperplastic growths) wastes time and money.
AI fixes these flaws:
- Polyp Detection: A 2018 study’s AI detected 94% of test polyps, helping even less experienced doctors. When the AI’s confidence score exceeded 75%, it alerted the endoscopist with a red border on screen—like a “second set of eyes.”
- Optical Biopsy: AI can predict if a small polyp is cancerous without removing it. A system called EndoBRAIN (trained on high-magnification endocytoscopy images) diagnosed colorectal lesions with 100% accuracy in validation tests. This lets doctors use a “diagnose and disregard” strategy for benign polyps, cutting down unnecessary procedures.
- Surgery Decisions: For T1 CRC (cancer that invades the submucosa), AI predicts if lymph node metastasis (LNM) is likely—avoiding unnecessary surgery. One AI analyzed 45 factors (age, tumor size, histology) and reduced unnecessary operations by 77%—better than American, European, or Japanese guidelines.
- IBD Care: AI even helps with inflammatory bowel disease (IBD). A 2019 study’s AI predicted persistent inflammation in ulcerative colitis patients with 91% accuracy—guiding treatment before damage worsens.
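The 75% confidence gate from the polyp-detection study above boils down to a simple threshold check before drawing the on-screen alert. This sketch is illustrative only: the function name and the frame scores are invented, not taken from the study.

```python
# Confidence-gated alerting, as in the 2018 polyp-detection study:
# alert (e.g. draw a red border) only when model confidence exceeds 75%.
# Function name and scores below are hypothetical.

ALERT_THRESHOLD = 0.75

def review_frames(confidences):
    """Return indices of frames that should trigger an on-screen alert."""
    return [i for i, c in enumerate(confidences) if c > ALERT_THRESHOLD]

frame_confidences = [0.40, 0.82, 0.74, 0.91]
alerts = review_frames(frame_confidences)  # frames 1 and 3 pass the gate
```

Gating on confidence is a design trade-off: a higher threshold means fewer distracting false alarms during the procedure, at the cost of suppressing some borderline detections.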
The Future of AI in Endoscopy
AI is close to real-world use. Olympus’ endocytoscope—a high-magnification endoscope—has been tested in large trials: it correctly predicted polyp pathology 98% of the time, matching tissue biopsies.
But challenges remain:
- Validation: Most AI studies are retrospective (using old data). Prospective trials (testing in real patients) are needed to confirm safety and efficacy.
- Usability: AI systems must be easy to use—no extra training or complicated steps.
- Regulation: Who’s liable if AI misses a cancer? How do we ensure AI systems are fair (not biased toward certain patient groups)?
These questions aren’t deal-breakers—they’re part of progress. As El Hajjar and Rey note, AI’s goal is to assist, not replace, doctors. It compensates for human limits—fatigue, experience gaps, data overload—and makes endoscopy safer for everyone.
Conclusion: AI as a Partner in Care
AI isn’t just a trend—it’s a breakthrough that will change GI endoscopy forever. It:
- Reduces inter-operator variability (ensuring consistent care regardless of a doctor’s experience).
- Catches hard-to-see lesions (like small cancers or polyps).
- Saves time and money (fewer unnecessary biopsies, shorter procedures).
- Eases the pressure on doctors (less worry about missing critical findings).
While more research and regulation are needed, the potential is clear. For patients, AI means fewer missed diagnoses and better outcomes. For doctors, it’s a powerful tool to do their best work.
As El Hajjar and Rey write: “AI has the potential to bring major improvements to GI endoscopy at all levels. It will make a big revolution in the development of GI endoscopy.”
The future of GI care is here—and it’s intelligent.
doi.org/10.1097/CM9.0000000000000623