Copyright
© The Author(s) 2025.
World J Gastroenterol. Oct 7, 2025; 31(37): 111327
Published online Oct 7, 2025. doi: 10.3748/wjg.v31.i37.111327
Table 1 Glossary of technical terms used in artificial intelligence applications for gastric diseases
Technical term | Simplified definition
CADe (computer-aided detection) | AI systems that automatically identify and highlight abnormal areas during endoscopy
CADx (computer-aided diagnosis) | AI systems that classify detected lesions into diagnostic categories (e.g., benign vs malignant)
CNN (convolutional neural network) | AI architecture designed to analyze visual information from endoscopic images
Edge computing | Processing AI calculations directly on local devices rather than remote servers, enabling real-time analysis |
LLM (large language model) | General-purpose AI systems, such as GPT-4, that can understand and generate human-like text
Multi-agent architectures | Systems where multiple specialized AI components work together to solve complex clinical problems |
Context window | The amount of information (text, images) an AI model can analyze simultaneously |
One-shot learning | AI’s ability to learn from a single example, reducing the need for large training datasets |
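To make the CADe/CADx and CNN entries in Table 1 more concrete, the minimal sketch below shows a toy convolutional image classifier of the general kind those terms describe. This is an illustrative assumption only: the class count, layer sizes, and 224 × 224 input are hypothetical and do not reproduce any system evaluated in the studies cited in this review.

```python
# Illustrative sketch only (assumed architecture, not from any cited study):
# a minimal CNN that maps an endoscopic frame to class logits, i.e. the
# "CADx" pattern of classifying a lesion into diagnostic categories.
import torch
import torch.nn as nn


class TinyEndoscopyCNN(nn.Module):
    """Toy two-class classifier (e.g., benign vs suspicious) for RGB frames."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # learn local edge/texture filters
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 224 x 224 -> 112 x 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 112 x 112 -> 56 x 56
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                     # global average pooling
            nn.Flatten(),
            nn.Linear(32, num_classes),                  # class logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = TinyEndoscopyCNN()
    frame = torch.randn(1, 3, 224, 224)   # one dummy endoscopic frame
    probabilities = torch.softmax(model(frame), dim=1)
    print(probabilities)                  # e.g., tensor([[0.48, 0.52]])
```

Clinical CADe systems additionally localize and highlight lesions within the frame, and production models rely on far deeper backbones with multi-center validation; the sketch is intended only to anchor the vocabulary of Table 1.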
Table 2 Representative systematic reviews and meta-analyses of artificial intelligence applications in gastric diseases (2019-2025)
Ref. | AI application | Included studies, n | Patients/images, n | Performance metrics | Key findings
For gastric cancer detection or diagnosis
Xie et al[38] | CNN for diagnosis and invasion depth prediction in gastric cancer | 17 | 5539/51446 | Sensitivity of 89%, specificity of 93%, AUC of 0.94 | AI comparable to experts, superior to non-experts |
Shi et al[39] | ML for image-based identification of EGC | 21 | Not specified | Sensitivity of 90%, specificity of 90%, area under the SROC curve of 0.96 | AI comparable to experts, superior to the combined group of expert and non-expert endoscopists
Jiang et al[40] | AI for endoscopic detection and invasion depth prediction of EGC | 16 | 1708519 images/22621 EGC images | Sensitivity of 86%, specificity of 93%, AUC of 0.96 (detection); sensitivity of 72%, specificity of 79%, AUC of 0.82 (invasion depth prediction) | AI-assisted diagnosis of EGC was more accurate than that of expert endoscopists
Klang et al[41] | CNN for detection and diagnosis of gastric cancer | 42 | Not specified | Not specified | AI models frequently matched or outperformed human endoscopists in diagnostic accuracy |
For diagnosis of H. pylori infection in endoscopic images
Bang et al[2] | Diagnostic test accuracy of AI for the prediction of H. pylori infection using endoscopic images | 8 | 1719/2855 endoscopic images with H. pylori infection and 2287 control images | Sensitivity of 87%, specificity of 86%, AUC of 0.92, DOR of 40 | An AI algorithm is a reliable tool for endoscopic diagnosis of H. pylori infection |
Mohan et al[42] | CNN-based AI in the diagnosis of H. pylori infection | 5 | 3558/10151 images | Sensitivity of 86.3%, specificity of 87.1%, accuracy of 87.1% (AI); sensitivity of 79.6%, specificity of 83.8%, accuracy of 82.9% (endoscopists) | AI model outperformed human endoscopists |
Dilaghi et al[43] | AI in the diagnosis of gastric precancerous lesions and H. pylori infection | 9 | 2430 | Accuracy of 79.6% | AI systems appear to be a useful resource for easier diagnosis of H. pylori infection
Parkash et al[44] | Diagnostic accuracy of AI algorithms for detecting H. pylori infection using endoscopic images | 11 | Over 6122/over 107466 | Sensitivity of 93%, specificity of 92%, accuracy of 92% | AI had high diagnostic accuracy for detecting H. pylori infection using endoscopic images |
Jiang et al[45] | Diagnostic performance of AI based on endoscopy for detecting H. pylori infection | 16 | 25002 images or patients | Sensitivity of 91%, specificity of 94%, accuracy of 98% | AI demonstrates higher diagnostic performance compared to both novices and senior endoscopists |
For diagnosis of gastric precancerous lesions in endoscopic images
Li et al[47] | AI’s diagnostic accuracy in detecting gastric intestinal metaplasia in endoscopy | 12 | 11173 | Sensitivity of 94%, specificity of 93%, accuracy of 97% | AI exhibited a higher diagnostic capacity than endoscopists |
Shi et al[46] | Accuracy of AI-assisted diagnosis of gastric atrophy | 8 | 25216/84678 | Sensitivity of 94%, specificity of 96%, AUC of 0.98 | The accuracy of AI in diagnosing chronic atrophic gastritis was significantly higher than that of endoscopists |
For diagnosis of invasion depth in gastric cancer
Xie et al[38] | CNN for diagnosis and invasion depth prediction in gastric cancer | 17 | 5539/51446 | Sensitivity of 82%, specificity of 90%, AUC of 0.90 | AI comparable to experts, superior to non-experts |
For prediction of lymph node metastasis in gastric cancer using clinical data
Li et al[50] | Diagnostic performance of ML in predicting lymph node metastasis in patients with gastric cancer | 41 | 56182 | Accuracy of 75.3% | ML showed excellent diagnostic performance in predicting lymph node metastasis of gastric cancer
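For readers unfamiliar with the diagnostic-accuracy metrics pooled in Table 2, the standard confusion-matrix definitions are sketched below, where TP, FP, TN, and FN denote true positives, false positives, true negatives, and false negatives; these are conventional formulas, not values drawn from any individual meta-analysis.

```latex
% Standard diagnostic-accuracy definitions (conventional formulas, not study-specific)
\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{Specificity} = \frac{TN}{TN + FP}
\]
\[
\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\text{DOR} = \frac{TP \cdot TN}{FP \cdot FN}
\]
```

AUC refers to the area under the receiver operating characteristic curve, which plots sensitivity against 1 - specificity across classification thresholds, and SROC denotes the summary ROC curve pooled across the included studies.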
- Citation: Gong EJ, Woo J, Lee JJ, Bang CS. Role of artificial intelligence in gastric diseases. World J Gastroenterol 2025; 31(37): 111327
- URL: https://www.wjgnet.com/1007-9327/full/v31/i37/111327.htm
- DOI: https://dx.doi.org/10.3748/wjg.v31.i37.111327