1
Shinozaki S, Watanabe J, Kanno T, Yuan Y, Yano T, Yamamoto H. Computer-aided diagnosis for colorectal polyp in comparison with endoscopists: Systematic review and meta-analysis. Dig Endosc 2025. [PMID: 40375757] [DOI: 10.1111/den.15047]
Abstract
OBJECTIVES Computer-aided diagnosis (CADx) is anticipated to enhance the prediction of colorectal polyp histology. This study aims to compare the diagnostic accuracy of CADx in the optical diagnosis of colorectal polyps, evaluating its performance against that of both experienced and inexperienced endoscopists. METHODS The protocol of this study was registered in the International Prospective Register of Systematic Reviews (PROSPERO) (ID: CRD42024585097). Three electronic databases including MEDLINE, Embase, and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched in September 2024. A bivariate random effects model was employed. The primary outcome was the comparison of sensitivity and specificity between CADx and experienced endoscopists; the secondary outcome was the comparison between CADx and inexperienced endoscopists. RESULTS Twenty-one studies involving 5477 polyps were included. The pooled sensitivities of CADx and experienced endoscopists were 0.87 (95% confidence interval [CI] 0.82-0.91) and 0.88 (95% CI 0.83-0.91), respectively (P = 0.93). The pooled specificities of CADx and experienced endoscopists were 0.85 (95% CI 0.78-0.90) and 0.87 (95% CI 0.82-0.92), respectively (P = 0.53). In nine studies comparing CADx with inexperienced endoscopists, the pooled sensitivities were 0.88 (95% CI 0.82-0.92) for CADx and 0.85 (95% CI 0.78-0.90) for inexperienced endoscopists (P = 0.46). The pooled specificities were 0.84 (95% CI 0.78-0.88) for CADx and 0.77 (95% CI 0.70-0.83) for inexperienced endoscopists (P = 0.16). CONCLUSION Computer-aided diagnosis does not demonstrate superior diagnostic accuracy in optical diagnosis of colorectal polyps compared to endoscopists, regardless of their experience level.
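As a rough illustration of the pooling step described above, the sketch below combines study-level sensitivities with a univariate DerSimonian-Laird random-effects model on the logit scale. This is a simplification of the bivariate random-effects model the authors actually used, and the per-study counts are hypothetical.

```python
import numpy as np

def pool_logit_dersimonian_laird(tp, fn):
    """Pool study-level sensitivities on the logit scale with a
    DerSimonian-Laird random-effects model (univariate simplification;
    the cited meta-analysis used a bivariate model)."""
    tp = np.asarray(tp, dtype=float) + 0.5   # continuity correction
    fn = np.asarray(fn, dtype=float) + 0.5
    sens = tp / (tp + fn)
    y = np.log(sens / (1.0 - sens))          # logit-transformed sensitivity
    v = 1.0 / tp + 1.0 / fn                  # approximate within-study variance
    w = 1.0 / v                              # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)          # Cochran's Q
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = y_re - 1.96 * se, y_re + 1.96 * se
    inv_logit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return inv_logit(y_re), (inv_logit(lo), inv_logit(hi))

# Hypothetical per-study true-positive / false-negative counts
pooled, ci = pool_logit_dersimonian_laird(tp=[45, 80, 120], fn=[6, 10, 18])
print(f"pooled sensitivity {pooled:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```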
Affiliation(s)
- Satoshi Shinozaki: Shinozaki Medical Clinic, Tochigi, Japan; Division of Gastroenterology, Department of Medicine, Jichi Medical University, Tochigi, Japan
- Jun Watanabe: Division of Gastroenterological, General and Transplant Surgery, Department of Surgery, Jichi Medical University, Tochigi, Japan; Division of Community and Family Medicine, Jichi Medical University, Tochigi, Japan
- Takeshi Kanno: R & D Division of Career Education for Medical Professionals, Medical Education Center, Jichi Medical University, Tochigi, Japan; Division of Gastroenterology, Tohoku University Graduate School of Medicine, Miyagi, Japan
- Yuhong Yuan: Department of Medicine, London Health Science Centre, London, ON, Canada; Department of Medicine, McMaster University, Hamilton, ON, Canada
- Tomonori Yano: Division of Gastroenterology, Department of Medicine, Jichi Medical University, Tochigi, Japan
- Hironori Yamamoto: Division of Gastroenterology, Department of Medicine, Jichi Medical University, Tochigi, Japan
2
De Carvalho T, Kader R, Brandao P, Lovat LB, Mountney P, Stoyanov D. NICE polyp feature classification for colonoscopy screening. Int J Comput Assist Radiol Surg 2025; 20:1015-1024. [PMID: 40075052] [PMCID: PMC12055651] [DOI: 10.1007/s11548-025-03338-9]
Abstract
PURPOSE Colorectal cancer is one of the most prevalent cancers worldwide, highlighting the critical need for early and accurate diagnosis to reduce patient risks. Inaccurate diagnoses not only compromise patient outcomes but also lead to increased costs and additional time burdens for clinicians. Enhancing diagnostic accuracy is essential, and this study focuses on improving the accuracy of polyp classification using the NICE classification, which evaluates three key features: colour, vessels, and surface pattern. METHODS A multiclass classifier was developed and trained to independently classify each of the three features in the NICE classification. The approach prioritizes clinically relevant features rather than relying on handcrafted or obscure deep learning features, ensuring transparency and reliability for clinical use. The classifier was trained on internal datasets and tested on both internal and public datasets. RESULTS The classifier successfully classified the three polyp features, achieving an accuracy of over 92% on internal datasets and exceeding 88% on a public dataset. The high classification accuracy demonstrates the system's effectiveness in identifying the key features from the NICE classification. CONCLUSION This study underscores the potential of using an independent classification approach for NICE features to enhance clinical decision-making in colorectal cancer diagnosis. The method shows promise in improving diagnostic accuracy, which could lead to better patient outcomes and more efficient clinical workflows.
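To make the "independent classification of each NICE feature" concrete, here is a minimal PyTorch sketch of a shared backbone with three separate heads for colour, vessels, and surface pattern. The backbone, class counts, and loss are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

class NICEFeatureClassifier(nn.Module):
    """Shared CNN backbone with one classification head per NICE feature
    (colour, vessels, surface pattern). Backbone and class counts are
    illustrative assumptions, not the authors' exact design."""
    def __init__(self, n_classes_per_feature=3):
        super().__init__()
        backbone = models.resnet50(weights=None)
        in_features = backbone.fc.in_features
        backbone.fc = nn.Identity()           # keep pooled features only
        self.backbone = backbone
        self.heads = nn.ModuleDict({
            name: nn.Linear(in_features, n_classes_per_feature)
            for name in ("colour", "vessels", "surface")
        })

    def forward(self, x):
        feats = self.backbone(x)
        return {name: head(feats) for name, head in self.heads.items()}

model = NICEFeatureClassifier()
logits = model(torch.randn(2, 3, 224, 224))    # dummy batch
loss = sum(nn.functional.cross_entropy(out, torch.randint(0, 3, (2,)))
           for out in logits.values())          # one cross-entropy term per feature
```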
Affiliation(s)
- Thomas De Carvalho: Odin Vision, London, UK; Department of Computer Science, UCL Hawkes Institute, University College London, London, UK
- Rawen Kader: Division of Surgery and Interventional Science, University College London, London, UK; Gastrointestinal Services, University College London Hospital, London, UK
- Laurence B Lovat: Division of Surgery and Interventional Science, University College London, London, UK; Gastrointestinal Services, University College London Hospital, London, UK
- Danail Stoyanov: Department of Computer Science, UCL Hawkes Institute, University College London, London, UK
3
Huang C, Song Y, Dong J, Yang F, Guo J, Sun S. Diagnostic performance of AI-assisted endoscopy diagnosis of digestive system tumors: an umbrella review. Front Oncol 2025; 15:1519144. [PMID: 40248201] [PMCID: PMC12003149] [DOI: 10.3389/fonc.2025.1519144]
Abstract
The diagnostic performance of artificial intelligence (AI)-assisted endoscopy for digestive tumors remains controversial. The objective of this umbrella review was to summarize the comprehensive evidence for the AI-assisted endoscopic diagnosis of digestive system tumors. We grouped the evidence according to the location of each digestive system tumor and performed separate subgroup analyses on the basis of the method of data collection and form of the data. We also compared the diagnostic performance of AI with that of experts and nonexperts. For early digestive system cancer and precancerous lesions, AI showed a high diagnostic performance in capsule endoscopy and esophageal squamous cell carcinoma. Additionally, AI-assisted endoscopic ultrasonography (EUS) had good diagnostic accuracy for pancreatic cancer. In the subgroup analysis, AI had a better diagnostic performance than experts for most digestive system tumors. However, the diagnostic performance of AI using video data requires improvement.
Affiliation(s)
- Changwei Huang: Department of Gastroenterology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China
- Yue Song: Department of Gastroenterology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China
- Jize Dong: Department of Gastroenterology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China
- Fan Yang: Department of Gastroenterology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China
- Jintao Guo: Department of Gastroenterology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China; Engineering Research Center of Ministry of Education for Minimally Invasive Gastrointestinal Endoscopic Techniques, Shenyang, Liaoning, China
- Siyu Sun: Department of Gastroenterology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China; Engineering Research Center of Ministry of Education for Minimally Invasive Gastrointestinal Endoscopic Techniques, Shenyang, Liaoning, China
4
Kirita K, Futagami S, Nakamura K, Agawa S, Ueki N, Higuchi K, Habiro M, Kawawa R, Kato Y, Tada T, Iwakiri K. Combination of artificial intelligence endoscopic diagnosis and Kimura-Takemoto classification determined by endoscopic experts may effectively evaluate the stratification of gastric atrophy in post-eradication status. DEN Open 2025; 5:e70029. [PMID: 39534404] [PMCID: PMC11555298] [DOI: 10.1002/deo2.70029]
Abstract
Background Because it is difficult even for expert endoscopists to diagnose early gastric cancer in post-eradication status, stratification of high-risk groups by the extent of gastric atrophy or intestinal metaplasia may be critical. We aimed to determine whether combining endoscopic artificial intelligence (AI) diagnosis of gastric atrophy with the Kimura-Takemoto classification could be a useful tool in both pre- and post-eradication status. Methods In Study I, 270 Helicobacter pylori-positive outpatients were enrolled; Study II compared patients (n = 72) before and after eradication therapy. Endoscopic appearance was assessed with the Kyoto classification and the Kimura-Takemoto classification. The trained neural network generated a continuous score between 0 and 1 for gastric atrophy. Results In H. pylori-positive gastritis, the severity of gastric atrophy determined by AI endoscopic diagnosis was significantly associated with the absence of a regular arrangement of collecting venules at the angulus, visibility of the vascular pattern, and mucus in the Kyoto classification. The severity of gastric atrophy differed significantly (p = 0.037 and p = 0.014) between high-risk and low-risk groups defined by the combination of the Kimura-Takemoto classification and endoscopic AI diagnosis in pre- and post-eradication status. In post-eradication status, the area under the curve for the severity of gastric atrophy determined by the combination of the Kimura-Takemoto classification and AI-diagnosed gastric atrophy (0.674) was higher than that for the Kimura-Takemoto classification alone. Conclusion A combination of AI-determined gastric atrophy and the Kimura-Takemoto classification may be a useful tool for predicting high-risk groups in post-eradication status.
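The AUC comparison reported above can be illustrated with a toy example: a continuous AI atrophy score and an ordinal Kimura-Takemoto grade are combined in a logistic model and compared against the grade alone by ROC AUC. All variable names and data below are synthetic, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
ai_score = rng.uniform(0, 1, n)               # AI output in [0, 1] for gastric atrophy
kimura_grade = rng.integers(0, 7, n)          # ordinal Kimura-Takemoto grade, coded 0-6 for illustration
high_risk = (rng.uniform(0, 1, n) < 0.3 + 0.4 * ai_score).astype(int)  # synthetic labels

X_single = kimura_grade.reshape(-1, 1)
X_combined = np.column_stack([kimura_grade, ai_score])

# In-sample AUCs, purely to show the comparison mechanics
auc_single = roc_auc_score(high_risk, LogisticRegression().fit(X_single, high_risk)
                           .predict_proba(X_single)[:, 1])
auc_combined = roc_auc_score(high_risk, LogisticRegression().fit(X_combined, high_risk)
                             .predict_proba(X_combined)[:, 1])
print(f"AUC Kimura-Takemoto alone: {auc_single:.3f}, combined with AI score: {auc_combined:.3f}")
```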
Affiliation(s)
- Kumiko Kirita: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Seiji Futagami: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Ken Nakamura: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Shuhei Agawa: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Nobue Ueki: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Kazutoshi Higuchi: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Mayu Habiro: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Rie Kawawa: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
- Katsuhiko Iwakiri: Department of Gastroenterology, Nippon Medical School Hospital, Graduate School of Medicine, Tokyo, Japan
5
Guo Z, Hu Y, Ge P, Chan IN, Yan T, Wong PK, Xu S, Li Z, Gao S. Enhancing colorectal polyp classification using gaze-based attention networks. PeerJ Comput Sci 2025; 11:e2780. [DOI: 10.7717/peerj-cs.2780]
Abstract
Colorectal polyps are potential precursor lesions of colorectal cancer. Accurate classification of colorectal polyps during endoscopy is crucial for early diagnosis and effective treatment, and automatic classification based on convolutional neural networks (CNNs) can assist endoscopists in both. However, this task remains challenging due to difficulties in the data acquisition and annotation processes, the poor interpretability of model outputs, and the lack of widespread acceptance of CNN models by clinicians. This study proposes an innovative approach that utilizes gaze-attention information from endoscopists as an auxiliary supervisory signal to train a CNN-based model for the classification of colorectal polyps. Gaze information from the reading of endoscopic images was first recorded through an eye-tracker. The gaze information was then processed and applied to supervise the CNN model's attention via an attention consistency module. Comprehensive experiments were conducted on a dataset containing three types of colorectal polyps. The results showed that EfficientNet_b1 with supervised gaze information achieved an overall test accuracy of 86.96%, a precision of 87.92%, a recall of 88.41%, an F1 score of 88.16%, and an area under the receiver operating characteristic (ROC) curve (AUC) of 0.9022. All evaluation metrics surpassed those of EfficientNet_b1 without gaze information supervision. The class activation maps generated by the proposed network also indicate that the endoscopist's gaze-attention information, as auxiliary prior knowledge, increases the accuracy of colorectal polyp classification, offering a new solution to the field of medical image analysis.
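A minimal sketch of what an attention consistency term could look like: the model's spatial attention (here, a simple channel-mean of late convolutional features) is resized and compared with the recorded gaze heatmap. The exact formulation used in the paper is not reproduced here; this is an assumed, simplified version.

```python
import torch
import torch.nn.functional as F

def attention_consistency_loss(feature_maps, gaze_heatmap):
    """Encourage CNN spatial attention to match an endoscopist gaze heatmap.
    feature_maps: (N, C, h, w) activations from a late conv block.
    gaze_heatmap: (N, 1, H, W) normalized gaze density in [0, 1].
    The channel-mean attention and MSE penalty are illustrative choices."""
    attn = feature_maps.mean(dim=1, keepdim=True)          # (N, 1, h, w)
    attn = F.interpolate(attn, size=gaze_heatmap.shape[-2:],
                         mode="bilinear", align_corners=False)
    # Min-max normalize each attention map to [0, 1] before comparison
    flat = attn.flatten(1)
    mn = flat.min(dim=1, keepdim=True).values.view(-1, 1, 1, 1)
    mx = flat.max(dim=1, keepdim=True).values.view(-1, 1, 1, 1)
    attn = (attn - mn) / (mx - mn + 1e-6)
    return F.mse_loss(attn, gaze_heatmap)

# total_loss = classification_loss + lambda_gaze * attention_consistency_loss(feats, gaze)
```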
Affiliation(s)
- Zhenghao Guo: School of Mechanical Engineering, Hubei University of Arts and Science, Xiangyang, China
- Yanyan Hu: Xiangyang Central Hospital, Affiliated Hospital of Hubei University of Arts and Science, Xiangyang, China
- Peixuan Ge: Department of Electromechanical Engineering, University of Macau, Taipa, Macao, China
- In Neng Chan: Department of Electromechanical Engineering, University of Macau, Taipa, Macao, China
- Tao Yan: School of Mechanical Engineering, Hubei University of Arts and Science, Xiangyang, China; Xiangyang Central Hospital, Affiliated Hospital of Hubei University of Arts and Science, Xiangyang, China; Department of Electromechanical Engineering, University of Macau, Taipa, Macao, China
- Pak Kin Wong: Department of Electromechanical Engineering, University of Macau, Taipa, Macao, China
- Shaoyong Xu: Xiangyang Central Hospital, Affiliated Hospital of Hubei University of Arts and Science, Xiangyang, China
- Zheng Li: Xiangyang Central Hospital, Affiliated Hospital of Hubei University of Arts and Science, Xiangyang, China
- Shan Gao: Xiangyang Central Hospital, Affiliated Hospital of Hubei University of Arts and Science, Xiangyang, China
6
Kumar A, Aravind N, Gillani T, Kumar D. Artificial intelligence breakthrough in diagnosis, treatment, and prevention of colorectal cancer – A comprehensive review. Biomed Signal Process Control 2025; 101:107205. [DOI: 10.1016/j.bspc.2024.107205]
7
Wang J, Feng H, Houssou Hounye A, Tang M, Shu Y, Hou M, Chen S. A Boundary-Enhanced Decouple Fusion Segmentation Network for Diagnosis of Adenomatous Polyps. J Imaging Inform Med 2025; 38:229-244. [PMID: 39037669] [PMCID: PMC11811332] [DOI: 10.1007/s10278-024-01195-7]
Abstract
Adenomatous polyps, a common premalignant lesion, are often classified into villous adenoma (VA) and tubular adenoma (TA). VA has a higher risk of malignancy, whereas TA typically grows slowly and has a lower likelihood of cancerous transformation. Accurate classification is essential for tailored treatment. In this study, we develop a deep learning-based approach for the localization and classification of adenomatous polyps using endoscopic images. Specifically, a pre-trained EGE-UNet is first adopted to extract regions of interest from the original images. Multi-level feature maps are then extracted by the feature extraction pipeline (FEP). The deep-level features are fed into the Pyramid Pooling Module (PPM) to capture global contextual information, and the squeeze body edge (SBE) module is then used to decouple the body and edge parts of the features, enabling separate analysis of their distinct characteristics. The Group Aggregation Bridge (GAB) and Boundary Enhancement Module (BEM) are then applied to enhance the body features and edge features, respectively, emphasizing their structural and morphological characteristics. By combining the features of the body and edge parts, the final output is obtained. Experiments show that the proposed method achieved promising results on two private datasets. For adenoma vs. non-adenoma classification, it achieved an mIoU of 91.41%, an mPA of 96.33%, an mHD of 11.63, and an mASD of 2.33. For adenoma subclassification (non-adenomas vs. villous adenomas vs. tubular adenomas), it achieved an mIoU of 91.21%, an mPA of 94.83%, an mHD of 13.75, and an mASD of 2.56. These results demonstrate the potential of our approach for precise adenomatous polyp classification.
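As a rough sketch of the body/edge decoupling idea, the module below treats a smoothed, low-frequency copy of the feature map as the "body" and the residual as the "edge", refines each with its own convolution, and fuses them. The specific operations are illustrative assumptions and do not reproduce the paper's SBE, GAB, or BEM modules.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BodyEdgeDecouple(nn.Module):
    """Split a feature map into a smooth 'body' part and a residual 'edge'
    part, refine each separately, and fuse. Illustrative simplification of
    a squeeze-body-edge style module."""
    def __init__(self, channels):
        super().__init__()
        self.body_conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.edge_conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        # Low-frequency approximation of the features -> body
        body = F.interpolate(F.avg_pool2d(x, kernel_size=4),
                             size=x.shape[-2:], mode="bilinear",
                             align_corners=False)
        edge = x - body                       # high-frequency residual -> edge
        body = self.body_conv(body)
        edge = self.edge_conv(edge)
        return body + edge, body, edge        # fused output plus both parts

feat = torch.randn(1, 64, 32, 32)
fused, body, edge = BodyEdgeDecouple(64)(feat)
```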
Affiliation(s)
- Jiaoju Wang: School of Mathematics and Statistics, Central South University, Changsha, 410083, Hunan, China; School of Mathematics and Statistics, Nanyang Normal University, Nanyang, 473061, Henan, China
- Haoran Feng: School of Mathematics and Statistics, Central South University, Changsha, 410083, Hunan, China
- Alphonse Houssou Hounye: School of Mathematics and Statistics, Central South University, Changsha, 410083, Hunan, China
- Meiling Tang: School of Mathematics and Statistics, Central South University, Changsha, 410083, Hunan, China
- Yiming Shu: School of Mathematics and Statistics, Central South University, Changsha, 410083, Hunan, China
- Muzhou Hou: School of Mathematics and Statistics, Central South University, Changsha, 410083, Hunan, China
- Shuijiao Chen: Department of Gastroenterology, Xiangya Hospital of Central South University, Changsha, 410008, Hunan, China
8
Parikh M, Tejaswi S, Girotra T, Chopra S, Ramai D, Tabibian JH, Jagannath S, Ofosu A, Barakat MT, Mishra R, Girotra M. Use of Artificial Intelligence in Lower Gastrointestinal and Small Bowel Disorders: An Update Beyond Polyp Detection. J Clin Gastroenterol 2025; 59:121-128. [PMID: 39774596] [DOI: 10.1097/mcg.0000000000002115]
Abstract
Machine learning and its specialized forms, such as Artificial Neural Networks and Convolutional Neural Networks, are increasingly being used for detecting and managing gastrointestinal conditions. Recent advancements involve using Artificial Neural Network models to enhance predictive accuracy for severe lower gastrointestinal (LGI) bleeding outcomes, including the need for surgery. To this end, artificial intelligence (AI)-guided predictive models have shown promise in improving management outcomes. While much literature focuses on AI in early neoplasia detection, this review highlights AI's role in managing LGI and small bowel disorders, including risk stratification for LGI bleeding, quality control, evaluation of inflammatory bowel disease, and video capsule endoscopy reading. Overall, the integration of AI into routine clinical practice is still developing, with ongoing research aimed at addressing current limitations and gaps in patient care.
Affiliation(s)
- Sooraj Tejaswi: University of California, Davis; Sutter Health, Sacramento
9
Weng Z, Wang C, Liu B, Yang Y, Zhang Y, Zhang C. Integrated analysis of bioinformatics, mendelian randomization, and experimental validation reveals novel diagnostic and therapeutic targets for osteoarthritis: progesterone as a potential therapeutic agent. J Orthop Surg Res 2025; 20:85. [PMID: 39849508] [PMCID: PMC11755849] [DOI: 10.1186/s13018-025-05459-y]
Abstract
BACKGROUND Osteoarthritis (OA), characterized by progressive degeneration of cartilage and reactive proliferation of subchondral bone, stands as a prevalent condition in orthopedic clinics. However, the precise mechanisms underlying OA pathogenesis remain inadequately explored. METHODS In this study, Random Forest (RF), Least Absolute Shrinkage and Selection Operator (LASSO), and Support Vector Machine-Recursive Feature Elimination (SVM-RFE) machine learning techniques were employed to identify hub genes. Based on these hub genes, an Artificial Neural Network (ANN) diagnostic model was constructed. The Drug Signatures Database (DSigDB) was utilized to screen small-molecule drugs targeting these hub genes, and molecular docking analyses and molecular dynamics simulations were employed to explore and validate the binding interactions between proteins and small-molecule drugs. Expression changes of the hub genes under inflammatory conditions were validated through in vitro experiments, including RT-qPCR and Western blotting, and the therapeutic effects of the identified small-molecule drug on chondrocytes under inflammatory conditions were further verified in vitro. Lastly, Mendelian randomization analysis was conducted to examine the causal association between progesterone levels and various OA phenotypes. RESULTS In this study, we identified three hub genes: interleukin 1 receptor-associated kinase 3 (IRAK3), integrin subunit beta-like 1 (ITGBL1), and Ras homolog family member U (RHOU). An Artificial Neural Network (ANN) diagnostic model constructed based on these hub genes demonstrated excellent performance in both training and validation phases. Screening with the Drug Signatures Database (DSigDB) identified progesterone as a small-molecule drug targeting these key proteins. Molecular docking analysis using AutoDock Vina revealed that progesterone exhibited binding energies of ≤ -7 kcal/mol with each of the key proteins, indicating strong binding affinity. Furthermore, molecular dynamics simulations validated the stability and strength of these interactions. RT-qPCR and Western blotting confirmed the downregulation of the hub genes in IL-1β-treated chondrocytes. Western blotting also demonstrated the potential therapeutic effects of progesterone on IL-1β-treated chondrocytes. Finally, Mendelian randomization analysis established a significant association between progesterone levels and multiple OA phenotypes. CONCLUSION In our study, IRAK3, ITGBL1, and RHOU were identified as potential novel diagnostic and therapeutic targets for OA. Progesterone was preliminarily validated as a small-molecule drug with potential effects on OA. Further research is crucial to elucidate the pathogenesis of OA and the specific therapeutic mechanisms involved.
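A compact sketch of the hub-gene selection logic described above (intersecting random forest importance, LASSO-type selection, and SVM-RFE, then feeding the survivors to a small neural network), using scikit-learn on synthetic data. Gene names, thresholds, and hyperparameters are placeholders, not the study's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))                     # synthetic expression matrix
y = rng.integers(0, 2, 100)                        # OA vs control labels (synthetic)
genes = np.array([f"gene_{i}" for i in range(50)])

# Random forest: top features by impurity importance
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
rf_top = set(genes[np.argsort(rf.feature_importances_)[-10:]])

# LASSO-style selection: non-zero coefficients of an L1-penalized logistic model
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
lasso_top = set(genes[np.flatnonzero(lasso.coef_[0])])

# SVM-RFE: recursive feature elimination with a linear SVM
svm_rfe = RFE(SVC(kernel="linear"), n_features_to_select=10).fit(X, y)
rfe_top = set(genes[svm_rfe.support_])

hub_genes = rf_top & lasso_top & rfe_top           # intersection = candidate hub genes
print("candidate hub genes:", sorted(hub_genes))

# Simple ANN diagnostic model on the selected features (if any were selected)
if hub_genes:
    idx = [list(genes).index(g) for g in sorted(hub_genes)]
    ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X[:, idx], y)
```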
Affiliation(s)
- Ziyu Weng: Department of Orthopedic Surgery, Zhongshan Hospital, Fudan University, Shanghai, 200032, China
- Chenzhong Wang: Department of Orthopedic Surgery, Zhongshan Hospital, Fudan University, Shanghai, 200032, China
- Bo Liu: Department of Orthopedic Surgery, Zhongshan Hospital, Fudan University, Shanghai, 200032, China
- Yi Yang: Department of Orthopedic Surgery, Zhongshan Hospital, Fudan University, Shanghai, 200032, China
- Yueqi Zhang: Department of Traumatic Surgery, School of Medicine, Shanghai East Hospital, Tongji University, Shanghai, China
- Chi Zhang: Department of Orthopedic Surgery, Zhongshan Hospital, Fudan University, Shanghai, 200032, China
10
Kusters CHJ, Jaspers TJM, Boers TGW, Jong MR, Jukema JB, Fockens KN, de Groof AJ, Bergman JJ, van der Sommen F, De With PHN. Will Transformers change gastrointestinal endoscopic image analysis? A comparative analysis between CNNs and Transformers, in terms of performance, robustness and generalization. Med Image Anal 2025; 99:103348. [PMID: 39298861] [DOI: 10.1016/j.media.2024.103348]
Abstract
Gastrointestinal endoscopic image analysis presents significant challenges, such as considerable variations in quality due to the challenging in-body imaging environment, the often-subtle nature of abnormalities with low interobserver agreement, and the need for real-time processing. These challenges pose strong requirements on the performance, generalization, robustness and complexity of deep learning-based techniques in such safety-critical applications. While Convolutional Neural Networks (CNNs) have been the go-to architecture for endoscopic image analysis, recent successes of the Transformer architecture in computer vision raise the question of whether this conclusion should be revisited. To this end, we evaluate and compare clinically relevant performance, generalization and robustness of state-of-the-art CNNs and Transformers for neoplasia detection in Barrett's esophagus. We have trained and validated several top-performing CNNs and Transformers on a total of 10,208 images (2,079 patients), and tested on a total of 7,118 images (998 patients) across multiple test sets, including a high-quality test set, two internal and two external generalization test sets, and a robustness test set. Furthermore, to expand the scope of the study, we have conducted the performance and robustness comparisons for colonic polyp segmentation (Kvasir-SEG) and angiodysplasia detection (Giana). The results obtained for featured models across a wide range of training set sizes demonstrate that Transformers achieve performance comparable to that of CNNs on various applications, show comparable or slightly improved generalization capabilities, and offer equally strong resilience and robustness against common image corruptions and perturbations. These findings confirm the viability of the Transformer architecture, which is particularly suited to the dynamic nature of endoscopic video analysis, characterized by fluctuating image quality, appearance and equipment configurations from hospital to hospital. The code is made publicly available at: https://github.com/BONS-AI-VCA-AMC/Endoscopy-CNNs-vs-Transformers.
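To illustrate the kind of head-to-head setup the comparison implies, the snippet below instantiates a CNN (ResNet-50) and a Vision Transformer (ViT-B/16) from torchvision with matching classification heads so both can be trained and evaluated under the same protocol. The model choices are assumptions for illustration; the authors' own code is linked at the end of the abstract.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_models(num_classes=2):
    """Instantiate a CNN and a Transformer with matching classification heads
    so both can be trained/evaluated under the same protocol (illustrative)."""
    cnn = models.resnet50(weights=None)
    cnn.fc = nn.Linear(cnn.fc.in_features, num_classes)

    vit = models.vit_b_16(weights=None)
    vit.heads.head = nn.Linear(vit.heads.head.in_features, num_classes)
    return {"resnet50": cnn, "vit_b_16": vit}

x = torch.randn(2, 3, 224, 224)   # dummy endoscopic image batch
for name, model in build_models().items():
    n_params = sum(p.numel() for p in model.parameters()) / 1e6
    print(f"{name}: {n_params:.1f}M parameters, output shape {model(x).shape}")
```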
Affiliation(s)
- Carolus H J Kusters: Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
- Tim J M Jaspers: Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
- Tim G W Boers: Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
- Martijn R Jong: Department of Gastroenterology and Hepatology, Amsterdam UMC, University of Amsterdam, Amsterdam, The Netherlands
- Jelmer B Jukema: Department of Gastroenterology and Hepatology, Amsterdam UMC, University of Amsterdam, Amsterdam, The Netherlands
- Kiki N Fockens: Department of Gastroenterology and Hepatology, Amsterdam UMC, University of Amsterdam, Amsterdam, The Netherlands
- Albert J de Groof: Department of Gastroenterology and Hepatology, Amsterdam UMC, University of Amsterdam, Amsterdam, The Netherlands
- Jacques J Bergman: Department of Gastroenterology and Hepatology, Amsterdam UMC, University of Amsterdam, Amsterdam, The Netherlands
- Fons van der Sommen: Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
- Peter H N De With: Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
11
Chen X, Xu B, Wei B, Ji L, Yang C, Zhan Q. Relationship Between Adenoma Detection Rate and Respective Withdrawal Time in Different Colon Segments: A Retrospective, Single-Center Study. JGH Open 2025; 9:e70095. [PMID: 39781025] [PMCID: PMC11708806] [DOI: 10.1002/jgh3.70095]
Abstract
Background and Aims The 6-min withdrawal time for colonoscopy is widely considered the standard of care. However, it may not be appropriate to divide these 6 min equally among the colon segments: because adenoma detection differs between segments, the appropriate withdrawal time may also differ by segment. Our objective was to evaluate the relationship between adenoma detection rate (ADR) and the respective withdrawal time in different colon segments. Methods Outpatients aged 18-75 years undergoing complete colonoscopy in the digestive endoscopy center were enrolled from November 2019 to November 2020. The entire colon was divided into four segments: ascending colon, transverse colon, descending colon, and rectosigmoid colon. The withdrawal time and ADR were recorded for each segment. Results A total of 586 outpatients (279 males, 307 females) were enrolled, and the overall ADR was 38.2%. The positive withdrawal time (adenomas detected) was longer than the negative withdrawal time (no adenomas detected) (334.04 ± 24.21 s vs. 303.65 ± 5.20 s, t = 1.26, p < 0.001). ADR in the ascending, transverse, descending, and rectosigmoid colon was 30.5%, 2.9%, 3.1%, and 7.5%, respectively, and in all segments the positive withdrawal time was longer than the negative withdrawal time (94.34 ± 33.76 s vs. 70.40 ± 41.84 s, t = 3.31, p = 0.001; 85.40 ± 49.76 s vs. 71.66 ± 36.87 s, t = 1.95, p = 0.025; 80.29 ± 39.85 s vs. 69.73 ± 35.96 s, t = 1.40, p = 0.016; 100.95 ± 55.92 s vs. 80.96 ± 42.87 s, t = 3.61, p < 0.001, respectively). The withdrawal time thresholds in the ascending, transverse, descending, and rectosigmoid colon determined by receiver operating characteristic (ROC) curves were 77, 61, 56, and 109 s, respectively. In the ascending colon, ADR was significantly higher (47.0% vs. 33.1%, p < 0.001) when the withdrawal time was ≥ 77 s. ADR was also significantly higher when the withdrawal time was ≥ 61 s in the transverse colon (42.7% vs. 32.7%, p = 0.013), ≥ 59 s in the descending colon (42.3% vs. 29.9%, p = 0.004), and ≥ 109 s in the rectosigmoid colon (52.2% vs. 33.9%, p < 0.001). After adjusting for age, sex, and BMI, logistic regression analysis showed that withdrawal time ≥ 77 s in the ascending colon (OR, 1.796; 95% CI, 1.273-2.532; p < 0.001), ≥ 61 s in the transverse colon (OR, 1.535; 95% CI, 1.094-2.155; p = 0.013), ≥ 56 s in the descending colon (OR, 1.722; 95% CI, 1.193-2.486; p = 0.004), and ≥ 109 s in the rectosigmoid colon (OR, 2.134; 95% CI, 1.446-2.350; p < 0.001) were independent predictors of a higher ADR. Conclusions ADR and withdrawal time vary across colon segments. During colonoscopy, withdrawal time in the ascending colon may be appropriately shortened, whereas adenomas in the rectosigmoid colon are more likely to be detected and do not require longer withdrawal times. The appropriate withdrawal time should be chosen according to the colon segment.
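A small synthetic sketch of the two analysis steps described for each segment: deriving a withdrawal-time cut-off from the ROC curve via the Youden index, then estimating the adjusted odds ratio for exceeding that cut-off with logistic regression. Data, covariates, and results below are simulated, not the study's.

```python
import numpy as np
from sklearn.metrics import roc_curve
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 586
withdrawal_s = rng.gamma(shape=4, scale=20, size=n)            # synthetic per-segment withdrawal time (s)
adenoma = (rng.uniform(size=n) < 0.2 + 0.003 * withdrawal_s).astype(int)

# ROC-derived cut-off: maximize Youden's J = sensitivity + specificity - 1
fpr, tpr, thresholds = roc_curve(adenoma, withdrawal_s)
cutoff = thresholds[np.argmax(tpr - fpr)]
print(f"optimal withdrawal-time threshold: {cutoff:.0f} s")

# Odds ratio for time >= cut-off, adjusted for a covariate (age here; sex/BMI analogous)
age = rng.normal(60, 10, n)
X = sm.add_constant(np.column_stack([(withdrawal_s >= cutoff).astype(int), age]))
fit = sm.Logit(adenoma, X).fit(disp=0)
print(f"OR for withdrawal time >= threshold: {np.exp(fit.params[1]):.2f}")
```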
Affiliation(s)
- Xujin Chen: Department of Gastroenterology, The Affiliated Wuxi People's Hospital of Nanjing Medical University, Wuxi People's Hospital, Wuxi Medical Center, Nanjing Medical University, Wuxi, Jiangsu, China
- Bingxin Xu: Department of Gastroenterology, The Affiliated Wuxi People's Hospital of Nanjing Medical University, Wuxi People's Hospital, Wuxi Medical Center, Nanjing Medical University, Wuxi, Jiangsu, China
- Bingni Wei: Department of Gastroenterology, The Affiliated Wuxi People's Hospital of Nanjing Medical University, Wuxi People's Hospital, Wuxi Medical Center, Nanjing Medical University, Wuxi, Jiangsu, China
- Lin Ji: Department of Gastroenterology, The Affiliated Wuxi People's Hospital of Nanjing Medical University, Wuxi People's Hospital, Wuxi Medical Center, Nanjing Medical University, Wuxi, Jiangsu, China
- Cheng Yang: Department of Digestive Endoscopy Center, The Affiliated Wuxi People's Hospital of Nanjing Medical University, Wuxi People's Hospital, Wuxi Medical Center, Nanjing Medical University, Wuxi, Jiangsu, China
- Qiang Zhan: Department of Gastroenterology, The Affiliated Wuxi People's Hospital of Nanjing Medical University, Wuxi People's Hospital, Wuxi Medical Center, Nanjing Medical University, Wuxi, Jiangsu, China
12
Wang YP, Jheng YC, Hou MC, Lu CL. The optimal labelling method for artificial intelligence-assisted polyp detection in colonoscopy. J Formos Med Assoc 2024:S0929-6646(24)00582-5. [PMID: 39730273] [DOI: 10.1016/j.jfma.2024.12.022]
Abstract
BACKGROUND The methodology for colon polyp labeling when establishing databases for machine learning is neither well described nor standardized. We aimed to identify the annotation method that generates the most accurate polyp detection model. METHODS A total of 3542 colonoscopy polyp images were obtained from the endoscopy database of a tertiary medical center. Two experienced endoscopists manually annotated each polyp with (1) exact outline segmentation and (2) a standard rectangular box close to the polyp margin, extended by 10%, 20%, 30%, 40%, and 50% in both width and length for AI model development. The images were randomly divided into training and validation sets in a 4:1 ratio. A U-Net convolutional network architecture was used to develop the automatic segmentation machine learning models. A separate, unrelated verification set was established to evaluate polyp detection performance across the different segmentation methods. RESULTS Extending the bounding box by 20% beyond the polyp margin gave the best performance in accuracy (95.42%), sensitivity (94.84%), and F1-score (95.41%). The exact outline segmentation model showed excellent sensitivity (99.6%) but the worst precision (77.47%). The 20% model was the best of the six models (confidence interval = 0.957-0.985; AUC = 0.971). CONCLUSIONS Labelling methodology affects the predictive performance of AI models for polyp detection. Extending the bounding box by 20% beyond the polyp margin resulted in the best polyp detection model based on the AUC data. A standardized approach to colon polyp labeling is needed so that the precision of different AI models can be compared.
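The margin-extension step is easy to make concrete: the helper below widens a tight polyp bounding box by a given percentage of its width and height (e.g. 0.20 for the best-performing 20% setting) and clips it to the image. The (x_min, y_min, x_max, y_max) convention is an assumption for illustration.

```python
def expand_bbox(bbox, image_size, margin=0.20):
    """Expand a tight polyp bounding box by `margin` (e.g. 0.20 = 20%) of its
    width and height, split evenly on each side and clipped to the image.
    Box format (x_min, y_min, x_max, y_max) is an assumed convention."""
    x_min, y_min, x_max, y_max = bbox
    img_w, img_h = image_size
    w, h = x_max - x_min, y_max - y_min
    dx, dy = w * margin / 2.0, h * margin / 2.0   # half of the extra length per side
    return (max(0, x_min - dx), max(0, y_min - dy),
            min(img_w, x_max + dx), min(img_h, y_max + dy))

# A 100x80 box on a 640x480 frame grows to 120x96 with the 20% setting
print(expand_bbox((200, 150, 300, 230), image_size=(640, 480), margin=0.20))
```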
Affiliation(s)
- Yen-Po Wang: Endoscopy Center for Diagnosis and Treatment, Taipei Veterans General Hospital, Taiwan; Division of Gastroenterology, Taipei Veterans General Hospital, Taiwan; Institute of Brain Science, National Yang Ming Chiao Tung University School of Medicine, Taiwan; Faculty of Medicine, National Yang Ming Chiao Tung University School of Medicine, Taiwan
- Ying-Chun Jheng: Department of Medical Research, Taipei Veterans General Hospital, Taiwan; Big Data Center, Taipei Veterans General Hospital, Taiwan
- Ming-Chih Hou: Endoscopy Center for Diagnosis and Treatment, Taipei Veterans General Hospital, Taiwan; Division of Gastroenterology, Taipei Veterans General Hospital, Taiwan; Faculty of Medicine, National Yang Ming Chiao Tung University School of Medicine, Taiwan
- Ching-Liang Lu: Endoscopy Center for Diagnosis and Treatment, Taipei Veterans General Hospital, Taiwan; Division of Gastroenterology, Taipei Veterans General Hospital, Taiwan; Institute of Brain Science, National Yang Ming Chiao Tung University School of Medicine, Taiwan
13
Misawa M, Kudo SE. Current Status of Artificial Intelligence Use in Colonoscopy. Digestion 2024; 106:138-145. [PMID: 39724867] [DOI: 10.1159/000543345]
Abstract
BACKGROUND Artificial intelligence (AI) has significantly impacted medical imaging, particularly in gastrointestinal endoscopy. Computer-aided detection and diagnosis systems (CADe and CADx) are thought to enhance the quality of colonoscopy procedures. SUMMARY Colonoscopy is essential for colorectal cancer screening but often misses a significant percentage of adenomas. AI-assisted systems employing deep learning offer improved detection and differentiation of colorectal polyps, potentially increasing adenoma detection rates by 8%-10%. The main benefit of CADe is in detecting small adenomas, whereas it has a limited impact on advanced neoplasm detection. Recent advancements include real-time CADe systems and CADx for histopathological predictions, aiding in the differentiation of neoplastic and nonneoplastic lesions. Biases such as the Hawthorne effect and potential overdiagnosis necessitate large-scale clinical trials to validate the long-term benefits of AI. Additionally, novel concepts such as computer-aided quality improvement systems are emerging to address limitations facing current CADe systems. KEY MESSAGES Despite the potential of AI for enhancing colonoscopy outcomes, its effectiveness in reducing colorectal cancer incidence and mortality remains unproven. Further prospective studies are essential to establish the overall utility and clinical benefits of AI in colonoscopy.
Affiliation(s)
- Masashi Misawa: Digestive Disease Center, Showa University Northern Yokohama Hospital, Tsuzuki, Yokohama, Japan
- Shin-Ei Kudo: Digestive Disease Center, Showa University Northern Yokohama Hospital, Tsuzuki, Yokohama, Japan
14
Li S, Xu M, Meng Y, Sun H, Zhang T, Yang H, Li Y, Ma X. The application of the combination between artificial intelligence and endoscopy in gastrointestinal tumors. MedComm – Oncology 2024; 3. [DOI: 10.1002/mog2.91]
Abstract
Gastrointestinal (GI) tumors have always been a major type of malignant tumor and a leading cause of tumor-related deaths worldwide. The main principles of modern medicine for GI tumors are early prevention, early diagnosis, and early treatment, with early diagnosis being the most effective measure. Endoscopy, due to its ability to visualize lesions, has been one of the primary modalities for screening, diagnosing, and treating GI tumors. However, a qualified endoscopist often requires long training and extensive experience, which to some extent limits the wider use of endoscopy. With advances in data science, artificial intelligence (AI) has brought a new development direction for the endoscopy of GI tumors. AI can quickly process large quantities of data and images and improve diagnostic accuracy with some training, greatly reducing the workload of endoscopists and assisting them in early diagnosis. Therefore, this review focuses on the combined application of endoscopy and AI in GI tumors in recent years, describing the latest research progress on the main types of tumors and their performance in clinical trials, the application of multimodal AI in endoscopy, the development of endoscopy, and the potential applications of AI within it, with the aim of providing a reference for subsequent research.
Affiliation(s)
- Shen Li: Department of Biotherapy, Cancer Center, West China Hospital, West China Medical School, Sichuan University, Chengdu, China
- Maosen Xu: Laboratory of Aging Research and Cancer Drug Target, State Key Laboratory of Biotherapy, West China Hospital, National Clinical Research, Sichuan University, Chengdu, Sichuan, China
- Yuanling Meng: West China School of Stomatology, Sichuan University, Chengdu, Sichuan, China
- Haozhen Sun: College of Life Sciences, Sichuan University, Chengdu, Sichuan, China
- Tao Zhang: Department of Biotherapy, Cancer Center, West China Hospital, West China Medical School, Sichuan University, Chengdu, China
- Hanle Yang: Department of Biotherapy, Cancer Center, West China Hospital, West China Medical School, Sichuan University, Chengdu, China
- Yueyi Li: Department of Biotherapy, Cancer Center, West China Hospital, West China Medical School, Sichuan University, Chengdu, China
- Xuelei Ma: Department of Biotherapy, Cancer Center, West China Hospital, West China Medical School, Sichuan University, Chengdu, China
15
Labaki C, Uche-Anya EN, Berzin TM. Artificial Intelligence in Gastrointestinal Endoscopy. Gastroenterol Clin North Am 2024; 53:773-786. [PMID: 39489586] [DOI: 10.1016/j.gtc.2024.08.005]
Abstract
Recent advancements in artificial intelligence (AI) have significantly impacted the field of gastrointestinal (GI) endoscopy, with applications spanning a wide range of clinical indications. The central goals for AI in GI endoscopy are to improve endoscopic procedural performance and quality assessment, optimize patient outcomes, and reduce administrative burden. Despite early progress, such as Food and Drug Administration approval of the first computer-aided polyp detection system in 2021, there are numerous important challenges to be faced on the path toward broader adoption of AI algorithms in clinical endoscopic practice.
Affiliation(s)
- Chris Labaki: Department of Internal Medicine, Beth Israel Deaconess Medical Center, Harvard Medical School, 300 Brookline Avenue, Boston, MA, USA
- Eugenia N Uche-Anya: Division of Gastroenterology, Massachusetts General Hospital, Harvard Medical School, 55 Fruit Street, Boston, MA, USA
- Tyler M Berzin: Center for Advanced Endoscopy, Division of Gastroenterology, Beth Israel Deaconess Medical Center, Harvard Medical School, 330 Brookline Avenue, Boston, MA, USA
16
Djinbachian R, Rex DK, von Renteln D. Optical Polyp Diagnosis in the Era of Artificial Intelligence. Am J Gastroenterol 2024:00000434-990000000-01436. [PMID: 39526672] [DOI: 10.14309/ajg.0000000000003195]
Abstract
The development of new image enhancement modalities and improved endoscopic imaging quality has not led to increased adoption of resect-and-discard in routine practice. Studies have shown that endoscopists have the capacity to achieve quality thresholds to perform optical diagnosis; however, this has not led to acceptance of optical diagnosis as a replacement for pathology for diminutive (1-5 mm) polyps. In recent years, artificial intelligence (AI)-based computer-assisted characterization of diminutive polyps has emerged as a strategy that could represent a breakthrough technology to enable widespread adoption of resect-and-discard. Recent evidence suggests that pathology-based diagnosis is suboptimal, as polyp nonretrieval, fragmentation, sectioning errors, incorrect diagnosis as "normal mucosa," and interpathologist variability limit the efficacy of pathology for the diagnosis of 1-5 mm polyps. New paradigms in performing polyp diagnosis with or without AI have emerged to compete with pathology in terms of efficacy. Strategies such as autonomous AI, AI-assisted human diagnosis, AI-unassisted human diagnosis, and combined approaches have been proposed as potential paradigms for resect-and-discard, although further research is still required to determine the optimal strategy. Implementation studies with high patient acceptance, in which polyps are truly discarded without histologic diagnosis, are paving the way toward normalizing resect-and-discard in routine clinical practice. Ultimately, the largest remaining challenge for computer-assisted characterization is endoscopists' perception of liability. The potential benefits of AI-based resect-and-discard are many, with very little potential harm. Real-world implementation studies are therefore required to pave the way for the acceptability of such strategies in routine practice.
Affiliation(s)
- Roupen Djinbachian: Division of Gastroenterology, University of Montreal Hospital Center (CHUM), Montreal, Quebec, Canada; Division of Gastroenterology, University of Montreal Hospital Research Center (CRCHUM), Montreal, Quebec, Canada
- Douglas K Rex: Division of Gastroenterology, Indiana University School of Medicine, Indianapolis, Indiana, USA
- Daniel von Renteln: Division of Gastroenterology, University of Montreal Hospital Center (CHUM), Montreal, Quebec, Canada; Division of Gastroenterology, University of Montreal Hospital Research Center (CRCHUM), Montreal, Quebec, Canada
17
Mota J, Almeida MJ, Martins M, Mendes F, Cardoso P, Afonso J, Ribeiro T, Ferreira J, Fonseca F, Limbert M, Lopes S, Macedo G, Castro Poças F, Mascarenhas M. Artificial Intelligence in Coloproctology: A Review of Emerging Technologies and Clinical Applications. J Clin Med 2024; 13:5842. [PMID: 39407902] [PMCID: PMC11477032] [DOI: 10.3390/jcm13195842]
Abstract
Artificial intelligence (AI) has emerged as a transformative tool across several specialties, namely gastroenterology, where it has the potential to optimize both diagnosis and treatment as well as enhance patient care. Coloproctology, due to its highly prevalent pathologies and their substantial potential to cause mortality and morbidity, has drawn considerable attention regarding AI applications. In fact, its application has yielded impressive outcomes in various domains, colonoscopy being one prominent example, where it aids in the detection of polyps and early signs of colorectal cancer with high accuracy and efficiency. Although less explored, AI-powered capsule endoscopy holds equivalent promise, enabling accurate and time-efficient video reading and already detecting a wide spectrum of anomalies. High-resolution anoscopy is an area of growing interest in recent years, with efforts being made to integrate AI. Other areas, such as functional studies, are currently at an early stage, but evidence is expected to emerge soon. According to the current state of research, AI is anticipated to empower gastroenterologists in the decision-making process, paving the way for a more precise approach to diagnosing and treating patients. This review aims to provide the state-of-the-art use of AI in coloproctology while also reflecting on future directions and perspectives.
Affiliation(s)
- Joana Mota: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- Maria João Almeida: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- Miguel Martins: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- Francisco Mendes: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- Pedro Cardoso: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- João Afonso: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- Tiago Ribeiro: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal
- João Ferreira: Department of Mechanical Engineering, Faculty of Engineering, University of Porto, 4200-065 Porto, Portugal; DigestAID - Digestive Artificial Intelligence Development, Rua Alfredo Allen n.° 455/461, 4200-135 Porto, Portugal
- Filipa Fonseca: Instituto Português de Oncologia de Lisboa Francisco Gentil (IPO Lisboa), 1099-023 Lisboa, Portugal
- Manuel Limbert: Instituto Português de Oncologia de Lisboa Francisco Gentil (IPO Lisboa), 1099-023 Lisboa, Portugal; Artificial Intelligence Group of the Portuguese Society of Coloproctology, 1050-117 Lisboa, Portugal
- Susana Lopes: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal; Artificial Intelligence Group of the Portuguese Society of Coloproctology, 1050-117 Lisboa, Portugal; Faculty of Medicine, University of Porto, 4200-047 Porto, Portugal
- Guilherme Macedo: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal; Faculty of Medicine, University of Porto, 4200-047 Porto, Portugal
- Fernando Castro Poças: Artificial Intelligence Group of the Portuguese Society of Coloproctology, 1050-117 Lisboa, Portugal; Department of Gastroenterology, Santo António University Hospital, 4099-001 Porto, Portugal; Abel Salazar Biomedical Sciences Institute (ICBAS), 4050-313 Porto, Portugal
- Miguel Mascarenhas: Precision Medicine Unit, Department of Gastroenterology, São João University Hospital, 4200-427 Porto, Portugal; WGO Gastroenterology and Hepatology Training Center, 4200-047 Porto, Portugal; Artificial Intelligence Group of the Portuguese Society of Coloproctology, 1050-117 Lisboa, Portugal; Faculty of Medicine, University of Porto, 4200-047 Porto, Portugal
18
Spadaccini M, Troya J, Khalaf K, Facciorusso A, Maselli R, Hann A, Repici A. Artificial Intelligence-assisted colonoscopy and colorectal cancer screening: Where are we going? Dig Liver Dis 2024; 56:1148-1155. [PMID: 38458884] [DOI: 10.1016/j.dld.2024.01.203]
Abstract
Colorectal cancer is a significant global health concern, necessitating effective screening strategies to reduce its incidence and mortality rates. Colonoscopy plays a crucial role in the detection and removal of colorectal neoplastic precursors. However, there are limitations and variations in the performance of endoscopists, leading to missed lesions and suboptimal outcomes. The emergence of artificial intelligence (AI) in endoscopy offers promising opportunities to improve the quality and efficacy of screening colonoscopies. In particular, AI applications, including computer-aided detection (CADe) and computer-aided characterization (CADx), have demonstrated the potential to enhance adenoma detection and optical diagnosis accuracy. Additionally, AI-assisted quality control systems aim to standardize the endoscopic examination process. This narrative review provides an overview of AI principles and discusses the current knowledge on AI-assisted endoscopy in the context of screening colonoscopies. It highlights the significant role of AI in improving lesion detection, characterization, and quality assurance during colonoscopy. However, further well-designed studies are needed to validate the clinical impact and cost-effectiveness of AI-assisted colonoscopy before its widespread implementation.
Affiliation(s)
- Marco Spadaccini: Department of Endoscopy, Humanitas Research Hospital, IRCCS, 20089 Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, 20089 Rozzano, Italy
- Joel Troya: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Würzburg, Germany
- Kareem Khalaf: Division of Gastroenterology, St. Michael's Hospital, University of Toronto, Toronto, Canada
- Antonio Facciorusso: Gastroenterology Unit, Department of Surgical and Medical Sciences, University of Foggia, Foggia, Italy
- Roberta Maselli: Department of Endoscopy, Humanitas Research Hospital, IRCCS, 20089 Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, 20089 Rozzano, Italy
- Alexander Hann: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Würzburg, Germany
- Alessandro Repici: Department of Endoscopy, Humanitas Research Hospital, IRCCS, 20089 Rozzano, Italy; Department of Biomedical Sciences, Humanitas University, 20089 Rozzano, Italy
19
|
Mandarino FV, Danese S, Uraoka T, Parra-Blanco A, Maeda Y, Saito Y, Kudo SE, Bourke MJ, Iacucci M. Precision endoscopy in colorectal polyps' characterization and planning of endoscopic therapy. Dig Endosc 2024; 36:761-777. [PMID: 37988279 DOI: 10.1111/den.14727] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/25/2023] [Accepted: 11/19/2023] [Indexed: 11/23/2023]
Abstract
Precision endoscopy in the management of colorectal polyps and early colorectal cancer has emerged as the standard of care. It includes optical characterization of polyps and estimation of submucosal invasion depth of large nonpedunculated colorectal polyps to select the appropriate endoscopic resection modality. Over time, several imaging modalities have been implemented in endoscopic practice to improve optical performance. Among these, image-enhanced endoscopy systems and magnification endoscopy represent now well-established tools. New advanced technologies, such as endocytoscopy and confocal laser endomicroscopy, have recently shown promising results in predicting the histology of colorectal polyps. In recent years, artificial intelligence has continued to enhance endoscopic performance in the characterization of colorectal polyps, overcoming the limitations of other imaging modes. In this review we retrace the path of precision endoscopy, analyzing the yield of various endoscopic imaging techniques in personalizing management of colorectal polyps and early colorectal cancer.
Collapse
Affiliation(s)
- Francesco Vito Mandarino
- Department of Gastroenterology and Gastrointestinal Endoscopy, San Raffaele Hospital IRCSS, Milan, Italy
- Department of Gastrointestinal Endoscopy, Westmead Hospital, Sydney, NSW, Australia
| | - Silvio Danese
- Department of Gastroenterology and Gastrointestinal Endoscopy, San Raffaele Hospital IRCSS, Milan, Italy
| | - Toshio Uraoka
- Department of Gastroenterology and Hepatology, Gunma University Graduate School of Medicine, Gumma, Japan
| | - Adolfo Parra-Blanco
- NIHR Nottingham Biomedical Research Centre, Nottingham University Hospitals NHS Trust and the University of Nottingham, Nottingham, UK
| | - Yasuharu Maeda
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yutaka Saito
- Endoscopy Division, National Cancer Center Hospital, Tokyo, Japan
| | - Shin-Ei Kudo
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Michael J Bourke
- Department of Gastrointestinal Endoscopy, Westmead Hospital, Sydney, NSW, Australia
| | - Marietta Iacucci
- Department of Gastroenterology, University College Cork, Cork, Ireland
| |
Collapse
|
20
|
Davila-Piñón P, Nogueira-Rodríguez A, Díez-Martín AI, Codesido L, Herrero J, Puga M, Rivas L, Sánchez E, Fdez-Riverola F, Glez-Peña D, Reboiro-Jato M, López-Fernández H, Cubiella J. Optical diagnosis in still images of colorectal polyps: comparison between expert endoscopists and PolyDeep, a Computer-Aided Diagnosis system. Front Oncol 2024; 14:1393815. [PMID: 38846970 PMCID: PMC11153726 DOI: 10.3389/fonc.2024.1393815] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/29/2024] [Accepted: 04/22/2024] [Indexed: 06/09/2024] Open
Abstract
Background PolyDeep is a computer-aided detection and classification (CADe/x) system trained to detect and classify polyps. During colonoscopy, CADe/x systems help endoscopists to predict the histology of colonic lesions. Objective To compare the diagnostic performance of PolyDeep and expert endoscopists for the optical diagnosis of colorectal polyps on still images. Methods PolyDeep Image Classification (PIC) is an in vitro diagnostic test study. The PIC database contains NBI images of 491 colorectal polyps with histological diagnosis. We evaluated the diagnostic performance of PolyDeep and four expert endoscopists for neoplasia (adenoma, sessile serrated lesion, traditional serrated adenoma) and adenoma characterization and compared them with the McNemar test. Receiver operating characteristic curves were constructed to assess the overall discriminatory ability, comparing the area under the curve of endoscopists and PolyDeep with the chi-square homogeneity of areas test. Results The diagnostic performance of the endoscopists and PolyDeep in the characterization of neoplasia is similar in terms of sensitivity (PolyDeep: 89.05%; E1: 91.23%, p=0.5; E2: 96.11%, p<0.001; E3: 86.65%, p=0.3; E4: 91.26%, p=0.3) and specificity (PolyDeep: 35.53%; E1: 33.80%, p=0.8; E2: 34.72%, p=1; E3: 39.24%, p=0.8; E4: 46.84%, p=0.2). The overall discriminative ability also showed no statistically significant differences (PolyDeep: 0.623; E1: 0.625, p=0.8; E2: 0.654, p=0.2; E3: 0.629, p=0.9; E4: 0.690, p=0.09). In the optical diagnosis of adenomatous polyps, we found that PolyDeep had a significantly higher sensitivity and a significantly lower specificity. The overall discriminative ability for adenomatous lesions of expert endoscopists is significantly higher than that of PolyDeep (PolyDeep: 0.582; E1: 0.685, p < 0.001; E2: 0.677, p < 0.0001; E3: 0.658, p < 0.01; E4: 0.694, p < 0.0001). Conclusion PolyDeep and endoscopists have similar diagnostic performance in the optical diagnosis of neoplastic lesions. However, endoscopists have a better global discriminatory ability than PolyDeep in the optical diagnosis of adenomatous polyps.
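For readers scanning this entry, a minimal Python sketch of the kind of paired comparison described above, assuming per-polyp binary calls: CADx and one endoscopist are each scored against histology for sensitivity and specificity, and McNemar's test is applied to their paired correctness. The arrays are illustrative placeholders, not PolyDeep data.

import numpy as np
from sklearn.metrics import confusion_matrix
from statsmodels.stats.contingency_tables import mcnemar

histology = np.array([1, 1, 0, 1, 0, 1, 0, 0, 1, 1])   # placeholder reference: 1 = neoplastic, 0 = non-neoplastic
cadx_call = np.array([1, 1, 0, 0, 1, 1, 0, 1, 1, 1])   # placeholder CADx predictions
endo_call = np.array([1, 0, 0, 1, 1, 1, 0, 0, 1, 1])   # placeholder endoscopist predictions

def sens_spec(truth, pred):
    # confusion_matrix with labels=[0, 1] unpacks as tn, fp, fn, tp
    tn, fp, fn, tp = confusion_matrix(truth, pred, labels=[0, 1]).ravel()
    return tp / (tp + fn), tn / (tn + fp)

print("CADx sensitivity/specificity:", sens_spec(histology, cadx_call))
print("Endoscopist sensitivity/specificity:", sens_spec(histology, endo_call))

# McNemar's test works on paired correctness; the discordant cells drive the statistic.
cadx_ok = cadx_call == histology
endo_ok = endo_call == histology
table = [[int(np.sum(cadx_ok & endo_ok)), int(np.sum(cadx_ok & ~endo_ok))],
         [int(np.sum(~cadx_ok & endo_ok)), int(np.sum(~cadx_ok & ~endo_ok))]]
print(mcnemar(table, exact=True))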
Collapse
Affiliation(s)
- Pedro Davila-Piñón
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Fundación Pública Galega de Investigación Biomédica Galicia Sur, Complexo Hospitalario Universitario de Ourense, Sergas, Ourense, Spain
| | - Alba Nogueira-Rodríguez
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
| | - Astrid Irene Díez-Martín
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Fundación Pública Galega de Investigación Biomédica Galicia Sur, Complexo Hospitalario Universitario de Ourense, Sergas, Ourense, Spain
| | - Laura Codesido
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Fundación Pública Galega de Investigación Biomédica Galicia Sur, Complexo Hospitalario Universitario de Ourense, Sergas, Ourense, Spain
| | - Jesús Herrero
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
| | - Manuel Puga
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
| | - Laura Rivas
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
| | - Eloy Sánchez
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
| | - Florentino Fdez-Riverola
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
| | - Daniel Glez-Peña
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
| | - Miguel Reboiro-Jato
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
| | - Hugo López-Fernández
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
| | - Joaquín Cubiella
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
| |
Collapse
|
21
|
Siddiqui S, Akram T, Ashraf I, Raza M, Khan MA, Damaševičius R. CG‐Net: A novel CNN framework for gastrointestinal tract diseases classification. INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY 2024; 34. [DOI: 10.1002/ima.23081] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/30/2023] [Accepted: 03/31/2024] [Indexed: 09/23/2024]
Abstract
The classification of medical images has had a significant influence on diagnostic techniques and therapeutic interventions. Conventional disease diagnosis procedures require substantial time and effort to reach an accurate diagnosis. Based on global statistics, gastrointestinal cancer has been recognized as a major contributor to cancer-related deaths. The complexities involved in resolving gastrointestinal tract (GIT) ailments arise from the need for elaborate methods to precisely identify the exact location of the problem. Therefore, doctors frequently use wireless capsule endoscopy to diagnose and treat GIT problems. This research aims to develop a robust framework using deep learning techniques to effectively classify GIT diseases for therapeutic purposes. A CNN-based framework, in conjunction with a feature selection method, has been proposed to improve the classification rate. The proposed framework has been evaluated using various performance measures, including accuracy, recall, precision, F1 measure, mean absolute error, and mean squared error.
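As a brief illustration of the performance measures listed above (accuracy, recall, precision, F1, MAE, MSE), a minimal scikit-learn sketch on dummy multi-class predictions follows; the class names are assumed for illustration and are not taken from the paper's dataset.

import numpy as np
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, mean_absolute_error, mean_squared_error)

classes = ["polyp", "ulcer", "bleeding", "normal"]   # assumed GIT categories, for labeling only
y_true = np.array([0, 1, 2, 3, 0, 1, 2, 3, 0, 1])    # dummy ground-truth class indices
y_pred = np.array([0, 1, 2, 3, 0, 2, 2, 3, 1, 1])    # dummy model predictions

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro"))
print("recall   :", recall_score(y_true, y_pred, average="macro"))
print("F1       :", f1_score(y_true, y_pred, average="macro"))
# MAE/MSE on class indices are only meaningful as the error magnitudes the paper reports.
print("MAE      :", mean_absolute_error(y_true, y_pred))
print("MSE      :", mean_squared_error(y_true, y_pred))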
Collapse
Affiliation(s)
- Samra Siddiqui
- Department of Computer Science HITEC University Taxila Pakistan
- Department of Computer Science COMSATS University Islamabad Wah Campus Pakistan
| | - Tallha Akram
- Department of Information Systems, College of Computer Engineering and Sciences Prince Sattam bin Abdulaziz University Al‐Kharj Saudi Arabia
- Department of Machine Learning Convex Solutions Pvt (Ltd) Islamabad Pakistan
| | - Imran Ashraf
- Department of Computer Science NUCES (FAST) Islamabad Pakistan
| | - Muddassar Raza
- Department of Computer Science HITEC University Taxila Pakistan
| | | | | |
Collapse
|
22
|
Jaspers TJM, Boers TGW, Kusters CHJ, Jong MR, Jukema JB, de Groof AJ, Bergman JJ, de With PHN, van der Sommen F. Robustness evaluation of deep neural networks for endoscopic image analysis: Insights and strategies. Med Image Anal 2024; 94:103157. [PMID: 38574544 DOI: 10.1016/j.media.2024.103157] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2023] [Revised: 03/19/2024] [Accepted: 03/21/2024] [Indexed: 04/06/2024]
Abstract
Computer-aided detection and diagnosis systems (CADe/CADx) in endoscopy are commonly trained using high-quality imagery, which is not representative of the heterogeneous input typically encountered in clinical practice. In endoscopy, the image quality heavily relies on both the skills and experience of the endoscopist and the specifications of the system used for screening. Factors such as poor illumination, motion blur, and specific post-processing settings can significantly alter the quality and general appearance of these images. The impact of this so-called domain gap (the mismatch between the data used to develop a system and the data it encounters after deployment) on the performance of deep neural network (DNN)-based endoscopic CAD systems remains largely unexplored. As many such systems, e.g. for polyp detection, are already being rolled out in clinical practice, this poses severe patient risks, particularly in community hospitals, where both the imaging equipment and operator experience are subject to considerable variation. Therefore, this study aims to evaluate the impact of this domain gap on the clinical performance of CADe/CADx for various endoscopic applications. For this, we leverage two publicly available data sets (KVASIR-SEG and GIANA) and two in-house data sets. We investigate the performance of commonly used DNN architectures under synthetic, clinically calibrated image degradations and on a prospectively collected dataset including 342 endoscopic images of lower subjective quality. Additionally, we assess the influence of DNN architecture and complexity, data augmentation, and pretraining techniques for improved robustness. The results reveal a considerable decline in performance of 11.6% (±1.5) as compared to the reference, within the clinically calibrated boundaries of image degradations. Nevertheless, employing more advanced DNN architectures and self-supervised in-domain pre-training effectively mitigates this drop to 7.7% (±2.03). Additionally, these enhancements yield the highest performance on the manually collected test set including images with lower subjective quality. By comprehensively assessing the robustness of popular DNN architectures and training strategies across multiple datasets, this study provides valuable insights into their performance and limitations for endoscopic applications. The findings highlight the importance of including robustness evaluation when developing DNNs for endoscopy applications and propose strategies to mitigate performance loss.
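A minimal sketch, under stated assumptions, of the robustness-check pattern described above: a classifier is scored on clean frames and on synthetically degraded versions (blur, dimmed illumination, sensor noise) and the relative drop is reported. The model, images, and degradation settings are stand-ins, not the study's networks, datasets, or clinically calibrated degradations.

import torch
import torch.nn as nn
import torchvision.transforms as T

class DummyClassifier(nn.Module):
    # Placeholder for a CADe/CADx backbone; random weights, two classes.
    def __init__(self, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(3, 8, 3, stride=2), nn.ReLU(),
                                 nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                 nn.Linear(8, n_classes))
    def forward(self, x):
        return self.net(x)

degrade = T.Compose([
    T.GaussianBlur(kernel_size=9, sigma=(2.0, 4.0)),                     # defocus/motion blur proxy
    T.ColorJitter(brightness=(0.4, 0.6)),                                # poor illumination proxy
    T.Lambda(lambda x: (x + 0.05 * torch.randn_like(x)).clamp(0, 1)),    # sensor noise proxy
])

model = DummyClassifier().eval()
images = torch.rand(16, 3, 224, 224)       # stand-in endoscopic frames in [0, 1]
labels = torch.randint(0, 2, (16,))        # stand-in labels

with torch.no_grad():
    acc_clean = (model(images).argmax(1) == labels).float().mean().item()
    acc_degraded = (model(degrade(images)).argmax(1) == labels).float().mean().item()

print(f"clean accuracy    : {acc_clean:.2f}")
print(f"degraded accuracy : {acc_degraded:.2f}")
print(f"relative drop     : {100 * (acc_clean - acc_degraded) / max(acc_clean, 1e-8):.1f}%")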
Collapse
Affiliation(s)
- Tim J M Jaspers
- Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands.
| | - Tim G W Boers
- Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
| | - Carolus H J Kusters
- Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
| | - Martijn R Jong
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Centers, University of Amsterdam, Amsterdam, The Netherlands
| | - Jelmer B Jukema
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Centers, University of Amsterdam, Amsterdam, The Netherlands
| | - Albert J de Groof
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Centers, University of Amsterdam, Amsterdam, The Netherlands
| | - Jacques J Bergman
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Centers, University of Amsterdam, Amsterdam, The Netherlands
| | - Peter H N de With
- Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
| | - Fons van der Sommen
- Department of Electrical Engineering, Video Coding & Architectures, Eindhoven University of Technology, Eindhoven, The Netherlands
| |
Collapse
|
23
|
Rogers MP, Janjua HM, Walczak S, Baker M, Read M, Cios K, Velanovich V, Pietrobon R, Kuo PC. Artificial Intelligence in Surgical Research: Accomplishments and Future Directions. Am J Surg 2024; 230:82-90. [PMID: 37981516 DOI: 10.1016/j.amjsurg.2023.10.045] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2023] [Accepted: 10/22/2023] [Indexed: 11/21/2023]
Abstract
MINI-ABSTRACT The study introduces various methods of performing conventional ML and their implementation in surgical areas, and the need to move beyond these traditional approaches given the advent of big data. OBJECTIVE Investigate current understanding and future directions of machine learning applications, such as risk stratification, clinical data analytics, and decision support, in surgical practice. SUMMARY BACKGROUND DATA The advent of the electronic health record, near unlimited computing, and open-source computational packages have created an environment for applying artificial intelligence, machine learning, and predictive analytic techniques to healthcare. The "hype" phase has passed, and algorithmic approaches are being developed for surgery patients through all stages of care, involving preoperative, intraoperative, and postoperative components. Surgeons must understand and critically evaluate the strengths and weaknesses of these methodologies. METHODS The current body of AI literature was reviewed, emphasizing on contemporary approaches important in the surgical realm. RESULTS AND CONCLUSIONS The unrealized impacts of AI on clinical surgery and its subspecialties are immense. As this technology continues to pervade surgical literature and clinical applications, knowledge of its inner workings and shortcomings is paramount in determining its appropriate implementation.
Collapse
Affiliation(s)
- Michael P Rogers
- Department of Surgery, University of South Florida Morsani College of Medicine, Tampa, FL, USA
| | - Haroon M Janjua
- Department of Surgery, University of South Florida Morsani College of Medicine, Tampa, FL, USA
| | - Steven Walczak
- School of Information & Florida Center for Cybersecurity, University of South Florida, Tampa, FL, USA
| | - Marshall Baker
- Department of Surgery, Loyola University Medical Center, Maywood, IL, USA
| | - Meagan Read
- Department of Surgery, University of South Florida Morsani College of Medicine, Tampa, FL, USA
| | - Konrad Cios
- Department of Surgery, University of South Florida Morsani College of Medicine, Tampa, FL, USA
| | - Vic Velanovich
- Department of Surgery, University of South Florida Morsani College of Medicine, Tampa, FL, USA
| | | | - Paul C Kuo
- Department of Surgery, University of South Florida Morsani College of Medicine, Tampa, FL, USA.
| |
Collapse
|
24
|
Yokote A, Umeno J, Kawasaki K, Fujioka S, Fuyuno Y, Matsuno Y, Yoshida Y, Imazu N, Miyazono S, Moriyama T, Kitazono T, Torisu T. Small bowel capsule endoscopy examination and open access database with artificial intelligence: The SEE-artificial intelligence project. DEN OPEN 2024; 4:e258. [PMID: 37359150 PMCID: PMC10288072 DOI: 10.1002/deo2.258] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 03/08/2023] [Revised: 05/31/2023] [Accepted: 06/05/2023] [Indexed: 06/28/2023]
Abstract
OBJECTIVES Artificial intelligence (AI) may be practical for image classification of small bowel capsule endoscopy (CE). However, creating a functional AI model is challenging. We attempted to create a dataset and an object-detection AI model for CE in order to explore modeling problems and to assist in reading small bowel CE. METHODS We extracted 18,481 images from 523 small bowel CE procedures performed at Kyushu University Hospital from September 2014 to June 2021. We annotated 12,320 images with 23,033 disease lesions, combined them with 6161 normal images as the dataset, and examined its characteristics. Based on the dataset, we created an object-detection AI model using YOLO v5 and performed test validation. RESULTS We annotated the dataset with 12 types of annotations, and multiple annotation types were observed in the same image. We validated our AI model on a test set of 1396 images; the sensitivity for all 12 types of annotations was about 91%, with 1375 true positives, 659 false positives, and 120 false negatives detected. The highest sensitivity for an individual annotation was 97%, and the highest area under the receiver operating characteristic curve was 0.98, but the quality of detection varied depending on the specific annotation. CONCLUSIONS An object-detection AI model for small bowel CE using YOLO v5 may provide effective and easy-to-understand reading assistance. In this SEE-AI project, we have released our dataset, the weights of the AI model, and a demonstration for experiencing our AI. We look forward to further improving the AI model in the future.
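A minimal sketch of standard YOLOv5 inference of the kind this project describes, using the public ultralytics/yolov5 torch.hub entry point; the weights and image path below are placeholders rather than the SEE-AI project's released checkpoint.

import torch

# Generic COCO-pretrained weights for illustration; a released project checkpoint would instead be
# loaded as a custom model, e.g. torch.hub.load('ultralytics/yolov5', 'custom', path='weights.pt').
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
model.conf = 0.25                        # confidence threshold for reported detections

results = model("capsule_frame.jpg")     # placeholder capsule-endoscopy frame on disk
detections = results.pandas().xyxy[0]    # columns: xmin, ymin, xmax, ymax, confidence, class, name
print(detections[["xmin", "ymin", "xmax", "ymax", "confidence", "name"]])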
Collapse
Affiliation(s)
- Akihito Yokote
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Junji Umeno
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Keisuke Kawasaki
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Shin Fujioka
- Department of Endoscopic Diagnostics and Therapeutics Kyushu University Hospital Fukuoka Japan
| | - Yuta Fuyuno
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Yuichi Matsuno
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Yuichiro Yoshida
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Noriyuki Imazu
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Satoshi Miyazono
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Tomohiko Moriyama
- International Medical Department Kyushu University Hospital Fukuoka Japan
| | - Takanari Kitazono
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| | - Takehiro Torisu
- Department of Medicine and Clinical Science Graduate School of Medical Science Kyushu University Fukuoka Japan
| |
Collapse
|
25
|
Xu J, Kuai Y, Chen Q, Wang X, Zhao Y, Sun B. Spatio-Temporal Feature Transformation Based Polyp Recognition for Automatic Detection: Higher Accuracy than Novice Endoscopists in Colorectal Polyp Detection and Diagnosis. Dig Dis Sci 2024; 69:911-921. [PMID: 38244123 PMCID: PMC10960915 DOI: 10.1007/s10620-024-08277-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/04/2023] [Accepted: 01/03/2024] [Indexed: 01/22/2024]
Abstract
BACKGROUND Artificial intelligence represents an emerging area with promising potential for improving colonoscopy quality. AIMS To develop a colon polyp detection model using spatio-temporal feature transformation (STFT) and evaluate its performance through a randomized sample experiment. METHODS Colonoscopy videos from the Digestive Endoscopy Center of the First Affiliated Hospital of Anhui Medical University, recorded between January 2018 and November 2022, were selected and divided into two datasets. To verify the model's practical application in clinical settings, 1500 colonoscopy images and 1200 polyp images of various sizes were randomly selected from the test set, and the recognition results of the STFT model were compared with those of endoscopists with different years of experience. RESULTS In the randomized sample trial involving 1500 colonoscopy images, the STFT model demonstrated significantly higher accuracy and specificity compared to endoscopists with low years of experience (0.902 vs. 0.809 and 0.898 vs. 0.826, respectively). Moreover, the model's sensitivity was 0.904, which was higher than that of endoscopists with low, medium, or high years of experience (0.80, 0.896, and 0.895, respectively), with statistical significance (P < 0.05). In the randomized sample experiment of 1200 polyp images of different sizes, the accuracy of the STFT model was significantly higher than that of endoscopists with low years of experience when the polyp size was ≤ 0.5 cm and 0.6-1.0 cm (0.902 vs. 0.70 and 0.953 vs. 0.865, respectively). CONCLUSIONS The STFT-based colon polyp detection model exhibits high accuracy in detecting polyps in colonoscopy videos, with particular efficiency in detecting small polyps (≤ 0.5 cm) (0.902 vs. 0.70, P < 0.001).
Collapse
Affiliation(s)
- Jianhua Xu
- Anhui Medical University, Hefei, Anhui, 230032, China
- The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China
| | - Yaxian Kuai
- Anhui Medical University, Hefei, Anhui, 230032, China
- The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China
| | - Qianqian Chen
- Anhui Medical University, Hefei, Anhui, 230032, China
- The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China
| | - Xu Wang
- The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China
- Anhui Provincial Key Laboratory of Digestive Disease, The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China
| | - Yihang Zhao
- Anhui Medical University, Hefei, Anhui, 230032, China
- The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China
| | - Bin Sun
- The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China.
- Anhui Provincial Key Laboratory of Digestive Disease, The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui, 230022, China.
- Department of Gastroenterology, The First Affiliated Hospital of Anhui Medical University, Jixi Road 218, Hefei, Anhui, 230022, China.
| |
Collapse
|
26
|
Kato S, Kudo SE, Minegishi Y, Miyata Y, Maeda Y, Kuroki T, Takashina Y, Mochizuki K, Tamura E, Abe M, Sato Y, Sakurai T, Kouyama Y, Tanaka K, Ogawa Y, Nakamura H, Ichimasa K, Ogata N, Hisayuki T, Hayashi T, Wakamura K, Miyachi H, Baba T, Ishida F, Nemoto T, Misawa M. Impact of computer-aided characterization for diagnosis of colorectal lesions, including sessile serrated lesions: Multireader, multicase study. Dig Endosc 2024; 36:341-350. [PMID: 37937532 DOI: 10.1111/den.14612] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/23/2023] [Accepted: 06/06/2023] [Indexed: 11/09/2023]
Abstract
OBJECTIVES Computer-aided characterization (CADx) may be used to implement optical biopsy strategies into colonoscopy practice; however, its impact on endoscopic diagnosis remains unknown. We aimed to evaluate the additional diagnostic value of CADx when used by endoscopists for assessing colorectal polyps. METHODS This was a single-center, multicase, multireader, image-reading study using randomly extracted images of pathologically confirmed polyps resected between July 2021 and January 2022. Approved CADx that could predict two-tier classification (neoplastic or nonneoplastic) by analyzing narrow-band images of the polyps was used to obtain a CADx diagnosis. Participating endoscopists determined if the polyps were neoplastic or not and noted their confidence level using a computer-based, image-reading test. The test was conducted twice with a 4-week interval: the first test was conducted without CADx prediction and the second test with CADx prediction. Diagnostic performances for neoplasms were calculated using the pathological diagnosis as reference and performances with and without CADx prediction were compared. RESULTS Five hundred polyps were randomly extracted from 385 patients and diagnosed by 14 endoscopists (including seven experts). The sensitivity for neoplasia was significantly improved by referring to CADx (89.4% vs. 95.6%). CADx also had incremental effects on the negative predictive value (69.3% vs. 84.3%), overall accuracy (87.2% vs. 91.8%), and high-confidence diagnosis rate (77.4% vs. 85.8%). However, there was no significant difference in specificity (80.1% vs. 78.9%). CONCLUSIONS Computer-aided characterization has added diagnostic value for differentiating colorectal neoplasms and may improve the high-confidence diagnosis rate.
Collapse
Affiliation(s)
- Shun Kato
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Shin-Ei Kudo
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yosuke Minegishi
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yuki Miyata
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yasuharu Maeda
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Takanori Kuroki
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yuki Takashina
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Kenichi Mochizuki
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Eri Tamura
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Masahiro Abe
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yuta Sato
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Tatsuya Sakurai
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yuta Kouyama
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Kenta Tanaka
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Yushi Ogawa
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Hiroki Nakamura
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Katsuro Ichimasa
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
- Department of Gastroenterology and Hepatology, National University Hospital, Singapore City, Singapore
| | - Noriyuki Ogata
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Tomokazu Hisayuki
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Takemasa Hayashi
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Kunihiko Wakamura
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Hideyuki Miyachi
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Toshiyuki Baba
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Fumio Ishida
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Tetsuo Nemoto
- Department of Diagnostic Pathology and Laboratory Medicine, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| | - Masashi Misawa
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
| |
Collapse
|
27
|
Cobanaj M, Corti C, Dee EC, McCullum L, Boldrini L, Schlam I, Tolaney SM, Celi LA, Curigliano G, Criscitiello C. Advancing equitable and personalized cancer care: Novel applications and priorities of artificial intelligence for fairness and inclusivity in the patient care workflow. Eur J Cancer 2024; 198:113504. [PMID: 38141549 PMCID: PMC11362966 DOI: 10.1016/j.ejca.2023.113504] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2023] [Accepted: 12/13/2023] [Indexed: 12/25/2023]
Abstract
Patient care workflows are highly multimodal and intertwined: the intersection of data outputs provided from different disciplines and in different formats remains one of the main challenges of modern oncology. Artificial Intelligence (AI) has the potential to revolutionize the current clinical practice of oncology owing to advancements in digitalization, database expansion, computational technologies, and algorithmic innovations that facilitate discernment of complex relationships in multimodal data. Within oncology, radiation therapy (RT) represents an increasingly complex working procedure, involving many labor-intensive and operator-dependent tasks. In this context, AI has gained momentum as a powerful tool to standardize treatment performance and reduce inter-observer variability in a time-efficient manner. This review explores the hurdles associated with the development, implementation, and maintenance of AI platforms and highlights current measures in place to address them. In examining AI's role in oncology workflows, we underscore that a thorough and critical consideration of these challenges is the only way to ensure equitable and unbiased care delivery, ultimately serving patients' survival and quality of life.
Collapse
Affiliation(s)
- Marisa Cobanaj
- National Center for Radiation Research in Oncology, OncoRay, Helmholtz-Zentrum Dresden-Rossendorf, Dresden, Germany
| | - Chiara Corti
- Breast Oncology Program, Dana-Farber Brigham Cancer Center, Boston, MA, USA; Harvard Medical School, Boston, MA, USA; Division of New Drugs and Early Drug Development for Innovative Therapies, European Institute of Oncology, IRCCS, Milan, Italy; Department of Oncology and Hematology-Oncology (DIPO), University of Milan, Milan, Italy.
| | - Edward C Dee
- Department of Radiation Oncology, Memorial Sloan Kettering Cancer Center, New York, NY, USA
| | - Lucas McCullum
- Department of Radiation Oncology, MD Anderson Cancer Center, Houston, TX, USA
| | - Laura Boldrini
- Division of New Drugs and Early Drug Development for Innovative Therapies, European Institute of Oncology, IRCCS, Milan, Italy; Department of Oncology and Hematology-Oncology (DIPO), University of Milan, Milan, Italy
| | - Ilana Schlam
- Department of Hematology and Oncology, Tufts Medical Center, Boston, MA, USA; Harvard T.H. Chan School of Public Health, Boston, MA, USA
| | - Sara M Tolaney
- Breast Oncology Program, Dana-Farber Brigham Cancer Center, Boston, MA, USA; Harvard Medical School, Boston, MA, USA; Department of Medical Oncology, Dana-Farber Cancer Institute, Boston, MA, USA
| | - Leo A Celi
- Department of Medicine, Beth Israel Deaconess Medical Center, Boston, MA, USA; Laboratory for Computational Physiology, Massachusetts Institute of Technology, Cambridge, MA, USA; Department of Biostatistics, Harvard T.H. Chan School of Public Health, Boston, MA, USA
| | - Giuseppe Curigliano
- Division of New Drugs and Early Drug Development for Innovative Therapies, European Institute of Oncology, IRCCS, Milan, Italy; Department of Oncology and Hematology-Oncology (DIPO), University of Milan, Milan, Italy
| | - Carmen Criscitiello
- Division of New Drugs and Early Drug Development for Innovative Therapies, European Institute of Oncology, IRCCS, Milan, Italy; Department of Oncology and Hematology-Oncology (DIPO), University of Milan, Milan, Italy
| |
Collapse
|
28
|
Kim BS, Cho M, Chung GE, Lee J, Kang HY, Yoon D, Cho WS, Lee JC, Bae JH, Kong HJ, Kim S. Density clustering-based automatic anatomical section recognition in colonoscopy video using deep learning. Sci Rep 2024; 14:872. [PMID: 38195632 PMCID: PMC10776865 DOI: 10.1038/s41598-023-51056-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2023] [Accepted: 12/29/2023] [Indexed: 01/11/2024] Open
Abstract
Recognizing anatomical sections during colonoscopy is crucial for diagnosing colonic diseases and generating accurate reports. While recent studies have endeavored to identify anatomical regions of the colon using deep learning, the deformable anatomical characteristics of the colon pose challenges for establishing a reliable localization system. This study presents a system utilizing 100 colonoscopy videos, combining density clustering and deep learning. Cascaded CNN models are employed to estimate the appendix orifice (AO), the flexures, and "outside of the body," sequentially. Subsequently, the DBSCAN algorithm is applied to identify anatomical sections. Clustering-based analysis integrates clinical knowledge and context based on the anatomical section within the model. We address challenges posed by colonoscopy images through non-informative frame removal during preprocessing. The image data are labeled by clinicians, and the system deduces section correspondence stochastically. The model categorizes the colon into three sections: right (cecum and ascending colon), middle (transverse colon), and left (descending colon, sigmoid colon, rectum). We estimated the appearance time of anatomical boundaries with an average error of 6.31 s for the AO, 9.79 s for the hepatic flexure (HF), 27.69 s for the splenic flexure (SF), and 3.26 s for outside of the body. The proposed method can facilitate future advancements towards AI-based automatic reporting, offering time-saving efficacy and standardization.
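A minimal sketch of the density-clustering step described above, under simulated assumptions: frame-level CNN scores for one landmark are thresholded, the timestamps of positive frames are clustered with DBSCAN, and the densest cluster yields an estimated appearance time. The scores, threshold, and eps/min_samples values are illustrative, not the study's settings.

import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
timestamps = np.arange(0, 600, 0.5)                      # 10 min of frames sampled at 2 fps
scores = rng.uniform(0, 0.4, timestamps.size)            # background (non-landmark) scores
scores[(timestamps > 180) & (timestamps < 195)] = 0.9    # a sustained run of landmark detections

positive_t = timestamps[scores > 0.5].reshape(-1, 1)     # timestamps of frames flagged by the CNN
labels = DBSCAN(eps=2.0, min_samples=5).fit_predict(positive_t)   # eps in seconds

clusters = [positive_t[labels == k].ravel() for k in set(labels) if k != -1]
if clusters:
    best = max(clusters, key=len)                        # densest sustained run wins over isolated false positives
    print(f"estimated landmark appearance time: {np.median(best):.1f} s")
else:
    print("no reliable landmark cluster found")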
Collapse
Grants
- 1711179421, RS-2021-KD000006 the Korea Medical Device Development Fund grant funded by the Korean government (the Ministry of Science and ICT, the Ministry of Trade, Industry and Energy, the Ministry of Health and Welfare, and the Ministry of Food and Drug Safety)
- 1711179421, RS-2021-KD000006 the Korea Medical Device Development Fund grant funded by the Korean government (the Ministry of Science and ICT, the Ministry of Trade, Industry and Energy, the Ministry of Health and Welfare, and the Ministry of Food and Drug Safety)
- 1711179421, RS-2021-KD000006 the Korea Medical Device Development Fund grant funded by the Korean government (the Ministry of Science and ICT, the Ministry of Trade, Industry and Energy, the Ministry of Health and Welfare, and the Ministry of Food and Drug Safety)
- IITP-2023-2018-0-01833 the Ministry of Science and ICT, Korea under the Information Technology Research Center (ITRC) support program
Collapse
Affiliation(s)
- Byeong Soo Kim
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, 08826, Korea
| | - Minwoo Cho
- Innovative Medical Technology Research Institute, Seoul National University Hospital, Seoul, 03080, Korea
- Department of Transdisciplinary Medicine, Seoul National University Hospital, Seoul, 03080, Korea
- Department of Medicine, Seoul National University College of Medicine, Seoul, 03080, Korea
| | - Goh Eun Chung
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
| | - Jooyoung Lee
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
| | - Hae Yeon Kang
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea
| | - Dan Yoon
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, 08826, Korea
| | - Woo Sang Cho
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, 08826, Korea
| | - Jung Chan Lee
- Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, 03080, Korea
- Institute of Bioengineering, Seoul National University, Seoul, 08826, Republic of Korea
- Institute of Medical and Biological Engineering, Medical Research Center, Seoul National University, Seoul, 03080, Korea
| | - Jung Ho Bae
- Department of Internal Medicine and Healthcare Research Institute, Healthcare System Gangnam Center, Seoul National University Hospital, Seoul, 06236, Korea.
| | - Hyoun-Joong Kong
- Innovative Medical Technology Research Institute, Seoul National University Hospital, Seoul, 03080, Korea.
- Department of Transdisciplinary Medicine, Seoul National University Hospital, Seoul, 03080, Korea.
- Department of Medicine, Seoul National University College of Medicine, Seoul, 03080, Korea.
- Medical Big Data Research Center, Seoul National University College of Medicine, Seoul, 03087, Korea.
| | - Sungwan Kim
- Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, 03080, Korea.
- Institute of Bioengineering, Seoul National University, Seoul, 08826, Republic of Korea.
- Artificial Intelligence Institute, Seoul National University, Research Park Building 942, 2 Fl., Seoul, 08826, Korea.
| |
Collapse
|
29
|
Yu D, Zhang J, Li X, Xiao S, Xing J, Li J. Developing the novel diagnostic model and potential drugs by integrating bioinformatics and machine learning for aldosterone-producing adenomas. Front Mol Biosci 2024; 10:1308754. [PMID: 38239411 PMCID: PMC10794617 DOI: 10.3389/fmolb.2023.1308754] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2023] [Accepted: 12/08/2023] [Indexed: 01/22/2024] Open
Abstract
Background: Aldosterone-producing adenomas (APA) are a common cause of primary aldosteronism (PA), a clinical syndrome characterized by hypertension and electrolyte disturbances. If untreated, it may lead to serious cardiovascular complications. Therefore, there is an urgent need for potential biomarkers and targeted drugs for the diagnosis and treatment of aldosteronism. Methods: We downloaded two datasets (GSE156931 and GSE60042) from the GEO database and merged them after batch-effect correction, then screened the top 50 differential genes using protein-protein interaction (PPI) analysis and performed enrichment analysis, followed by screening the aldosterone adenoma-related genes (ARGs) among the top 50 using three machine learning algorithms. We performed GSEA analysis on the ARGs separately and constructed artificial neural networks based on the ARGs. Finally, the Enrich platform was utilized to identify drugs with potential therapeutic effects on APA by targeting the ARGs. Results: We identified 190 differential genes by differential analysis, then identified the top 50 genes by PPI; enrichment analysis showed that they were mainly enriched in amino acid metabolic pathways. The three machine learning algorithms then identified five ARGs, namely SST, RAB3C, PPY, CYP3A4, and CDH10, and the artificial neural network (ANN) constructed on the basis of these five ARGs had good diagnostic performance for APA, with an AUC of 1.0 in the training set and 0.755 in the validation set. The Enrich platform then identified drugs targeting the ARGs with potential therapeutic effects on APA. Conclusion: We identified five ARGs for APA through bioinformatic analysis, constructed an ANN based on them with good diagnostic performance, and identified drugs with potential therapeutic effects on APA by targeting these ARGs. Our study provides more options for the diagnosis and treatment of APA.
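A minimal sketch of the final modelling step described above: a small neural network trained on a handful of selected gene features and evaluated by AUC. The expression matrix is simulated, the gene names are used only as labels, and the network size is an assumption rather than the authors' ANN.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

genes = ["SST", "RAB3C", "PPY", "CYP3A4", "CDH10"]      # the five ARGs, used here as column labels only
rng = np.random.default_rng(1)
X = rng.normal(size=(120, len(genes)))                  # simulated expression values
y = (X[:, 0] - 0.8 * X[:, 3] + rng.normal(scale=0.8, size=120) > 0).astype(int)   # simulated labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("validation AUC:", round(roc_auc_score(y_te, ann.predict_proba(X_te)[:, 1]), 3))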
Collapse
Affiliation(s)
- Deshui Yu
- Department of Urology, Air Force Medical Center, Beijing, China
- China Medical University, Shenyang, China
| | - Jinxuan Zhang
- Department of Urology, Air Force Medical Center, Beijing, China
- China Medical University, Shenyang, China
| | - Xintao Li
- Department of Urology, Air Force Medical Center, Beijing, China
| | - Shuwei Xiao
- Department of Urology, Air Force Medical Center, Beijing, China
| | - Jizhang Xing
- Department of Urology, Air Force Medical Center, Beijing, China
| | - Jianye Li
- Department of Urology, Air Force Medical Center, Beijing, China
- China Medical University, Shenyang, China
| |
Collapse
|
30
|
Xia J, Jiang B, Pan J, Liao Z. Imaging of gastrointestinal endoscopy. TRANSPATHOLOGY 2024:171-183. [DOI: 10.1016/b978-0-323-95223-1.00026-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/03/2025]
|
31
|
Xin Y, Zhang Q, Liu X, Li B, Mao T, Li X. Application of artificial intelligence in endoscopic gastrointestinal tumors. Front Oncol 2023; 13:1239788. [PMID: 38144533 PMCID: PMC10747923 DOI: 10.3389/fonc.2023.1239788] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2023] [Accepted: 11/17/2023] [Indexed: 12/26/2023] Open
Abstract
With an increasing number of patients with gastrointestinal cancer, effective and accurate early diagnostic clinical tools are required to provide better health care for these patients. Recent studies have shown that artificial intelligence (AI) plays an important role in the diagnosis and treatment of patients with gastrointestinal tumors, which not only improves the efficiency of early tumor screening, but also significantly improves the survival rate of patients after treatment. With the aid of the efficient learning and judgment abilities of AI, endoscopists can improve the accuracy of diagnosis and treatment through endoscopy and avoid incorrect descriptions or judgments of gastrointestinal lesions. The present article provides an overview of the application status of various artificial intelligence techniques in gastric and colorectal cancers in recent years, and clarifies the direction of future research and clinical practice from a clinical perspective, in order to provide a comprehensive theoretical basis for AI as a promising diagnostic and therapeutic tool for gastrointestinal cancer.
Collapse
Affiliation(s)
| | | | | | | | | | - Xiaoyu Li
- Department of Gastroenterology, The Affiliated Hospital of Qingdao University, Qingdao, China
| |
Collapse
|
32
|
Hsu WF, Chiu HM. Optimization of colonoscopy quality: Comprehensive review of the literature and future perspectives. Dig Endosc 2023; 35:822-834. [PMID: 37381701 DOI: 10.1111/den.14627] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/14/2023] [Accepted: 06/27/2023] [Indexed: 06/30/2023]
Abstract
Colonoscopy is crucial in preventing colorectal cancer (CRC) and reducing associated mortality. This comprehensive review examines the importance of high-quality colonoscopy and associated quality indicators, including bowel preparation, cecal intubation rate, withdrawal time, adenoma detection rate (ADR), complete resection, specimen retrieval, complication rates, and patient satisfaction, while also discussing other ADR-related metrics. Additionally, the review draws attention to often overlooked quality aspects, such as nonpolypoid lesion detection, as well as insertion and withdrawal skills. Moreover, it explores the potential of artificial intelligence in enhancing colonoscopy quality and highlights specific considerations for organized screening programs. The review also emphasizes the implications of organized screening programs and the need for continuous quality improvement. A high-quality colonoscopy is crucial for preventing postcolonoscopy CRC- and CRC-related deaths. Health-care professionals must develop a thorough understanding of colonoscopy quality components, including technical quality, patient safety, and patient experience. By prioritizing ongoing evaluation and refinement of these quality indicators, health-care providers can contribute to improved patient outcomes and develop more effective CRC screening programs.
Collapse
Affiliation(s)
- Wen-Feng Hsu
- Department of Internal Medicine, National Taiwan University Hospital, Taipei, Taiwan
| | - Han-Mo Chiu
- Department of Internal Medicine, National Taiwan University Hospital, Taipei, Taiwan
| |
Collapse
|
33
|
Ding M, Yan J, Chao G, Zhang S. Application of artificial intelligence in colorectal cancer screening by colonoscopy: Future prospects (Review). Oncol Rep 2023; 50:199. [PMID: 37772392 DOI: 10.3892/or.2023.8636] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2023] [Accepted: 07/07/2023] [Indexed: 09/30/2023] Open
Abstract
Colorectal cancer (CRC) has become a severe global health concern, with the third-highest incidence and second-highest mortality rate among all cancers. The burden of CRC is expected to increase by 60% by 2030. Fortunately, effective, evidence-based early screening could significantly reduce the incidence and mortality of CRC. Colonoscopy is the core screening method for CRC, with high popularity and accuracy. Yet, the accuracy of colonoscopy in CRC screening depends on the experience and condition of the operating physicians, and it is challenging to maintain a consistently high CRC diagnostic rate. Artificial intelligence (AI)-assisted colonoscopy can compensate for these shortcomings and improve the accuracy, efficiency, and quality of colonoscopy screening. The unique advantages of AI, such as continuously advancing high-performance computing capabilities and innovative deep-learning architectures, have a substantial impact on controlling CRC morbidity and mortality and highlight its role in colonoscopy screening.
Collapse
Affiliation(s)
- Menglu Ding
- The Second Affiliated Hospital of Zhejiang Chinese Medical University (The Xin Hua Hospital of Zhejiang Province), Hangzhou, Zhejiang 310000, P.R. China
| | - Junbin Yan
- The Second Affiliated Hospital of Zhejiang Chinese Medical University (The Xin Hua Hospital of Zhejiang Province), Hangzhou, Zhejiang 310000, P.R. China
| | - Guanqun Chao
- Department of General Practice, Sir Run Run Shaw Hospital, Zhejiang University, Hangzhou, Zhejiang 310000, P.R. China
| | - Shuo Zhang
- The Second Affiliated Hospital of Zhejiang Chinese Medical University (The Xin Hua Hospital of Zhejiang Province), Hangzhou, Zhejiang 310000, P.R. China
| |
Collapse
|
34
|
Kim MJ, Kim SH, Kim SM, Nam JH, Hwang YB, Lim YJ. The Advent of Domain Adaptation into Artificial Intelligence for Gastrointestinal Endoscopy and Medical Imaging. Diagnostics (Basel) 2023; 13:3023. [PMID: 37835766 PMCID: PMC10572560 DOI: 10.3390/diagnostics13193023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2023] [Revised: 09/01/2023] [Accepted: 09/12/2023] [Indexed: 10/15/2023] Open
Abstract
Artificial intelligence (AI) is a subfield of computer science that aims to implement computer systems that perform tasks generally requiring human learning, reasoning, and perceptual abilities. AI is widely used in the medical field. The interpretation of medical images requires considerable effort, time, and skill. AI-aided interpretation, such as automated abnormal lesion detection and image classification, is therefore a promising application area. However, when images with different characteristics are acquired, depending on the manufacturer and imaging environment, a so-called domain shift problem occurs in which the developed AI shows poor generalizability. Domain adaptation is used to address this problem. Domain adaptation converts images from one domain into a form suitable for another, and it has shown promise in reducing the differences in appearance among images collected from different devices. Domain adaptation is expected to improve the reading accuracy of AI for heterogeneous image distributions in gastrointestinal (GI) endoscopy and medical image analysis. In this paper, we review the history and basic characteristics of domain shift and domain adaptation. We also address their use in gastrointestinal endoscopy and the medical field more generally through published examples, perspectives, and future directions.
Collapse
Affiliation(s)
- Min Ji Kim
- Division of Gastroenterology, Department of Internal Medicine, Dongguk University Ilsan Hospital, Dongguk University College of Medicine, Goyang 10326, Republic of Korea; (M.J.K.); (S.H.K.); (J.H.N.)
| | - Sang Hoon Kim
- Division of Gastroenterology, Department of Internal Medicine, Dongguk University Ilsan Hospital, Dongguk University College of Medicine, Goyang 10326, Republic of Korea; (M.J.K.); (S.H.K.); (J.H.N.)
| | - Suk Min Kim
- Department of Intelligent Systems and Robotics, College of Electrical & Computer Engineering, Chungbuk National University, Cheongju 28644, Republic of Korea; (S.M.K.); (Y.B.H.)
| | - Ji Hyung Nam
- Division of Gastroenterology, Department of Internal Medicine, Dongguk University Ilsan Hospital, Dongguk University College of Medicine, Goyang 10326, Republic of Korea; (M.J.K.); (S.H.K.); (J.H.N.)
| | - Young Bae Hwang
- Department of Intelligent Systems and Robotics, College of Electrical & Computer Engineering, Chungbuk National University, Cheongju 28644, Republic of Korea; (S.M.K.); (Y.B.H.)
| | - Yun Jeong Lim
- Division of Gastroenterology, Department of Internal Medicine, Dongguk University Ilsan Hospital, Dongguk University College of Medicine, Goyang 10326, Republic of Korea; (M.J.K.); (S.H.K.); (J.H.N.)
| |
Collapse
|
35
|
Arif AA, Jiang SX, Byrne MF. Artificial intelligence in endoscopy: Overview, applications, and future directions. Saudi J Gastroenterol 2023; 29:269-277. [PMID: 37787347 PMCID: PMC10644999 DOI: 10.4103/sjg.sjg_286_23] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/08/2023] [Accepted: 08/16/2023] [Indexed: 09/15/2023] Open
Abstract
Since the emergence of artificial intelligence (AI) in medicine, endoscopy applications in gastroenterology have been at the forefront of innovation. The ever-increasing number of studies necessitates organizing and classifying applications in a useful way. Separating AI capabilities into computer-aided detection (CADe), diagnosis (CADx), and quality assessment (CADq) allows for a systematic evaluation of each application. CADe studies have shown promise in the accurate detection of esophageal, gastric, and colonic neoplasia as well as in identifying sources of bleeding and Crohn's disease in the small bowel. While more advanced CADx applications employ optical biopsies to give further information to characterize neoplasia and grade inflammatory disease, diverse CADq applications ensure quality and increase the efficiency of procedures. Future applications show promise in advanced therapeutic modalities and integrated systems that provide multimodal capabilities. AI is set to revolutionize clinical decision making and the performance of endoscopy.
Collapse
Affiliation(s)
- Arif A. Arif
- Department of Medicine, University of British Columbia, Vancouver, BC, Canada
| | - Shirley X. Jiang
- Department of Medicine, University of British Columbia, Vancouver, BC, Canada
| | - Michael F. Byrne
- Division of Gastroenterology, Department of Medicine, University of British Columbia, Vancouver, BC, Canada
- Satisfai Health, Vancouver, BC, Canada
| |
|
36
|
Houwen BBSL, Hazewinkel Y, Giotis I, Vleugels JLA, Mostafavi NS, van Putten P, Fockens P, Dekker E. Computer-aided diagnosis for optical diagnosis of diminutive colorectal polyps including sessile serrated lesions: a real-time comparison with screening endoscopists. Endoscopy 2023; 55:756-765. [PMID: 36623839 PMCID: PMC10374350 DOI: 10.1055/a-2009-3990] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/13/2022] [Accepted: 01/09/2023] [Indexed: 01/11/2023]
Abstract
BACKGROUND: We aimed to compare the accuracy of the optical diagnosis of diminutive colorectal polyps, including sessile serrated lesions (SSLs), between a computer-aided diagnosis (CADx) system and endoscopists during real-time colonoscopy. METHODS: We developed the POLyp Artificial Recognition (POLAR) system, which was capable of performing real-time characterization of diminutive colorectal polyps. For pretraining, the Microsoft-COCO dataset with over 300,000 nonpolyp object images was used. For training, eight hospitals prospectively collected 2637 annotated images from 1339 polyps (i.e. the publicly available online POLAR database). For clinical validation, POLAR was tested during colonoscopy in patients with a positive fecal immunochemical test (FIT), and compared with the performance of 20 endoscopists from eight hospitals. Endoscopists were blinded to the POLAR output. The primary outcome was the comparison of accuracy of the optical diagnosis of diminutive colorectal polyps between POLAR and endoscopists (neoplastic [adenomas and SSLs] versus non-neoplastic [hyperplastic polyps]). Histopathology served as the reference standard. RESULTS: During clinical validation, 423 diminutive polyps detected in 194 FIT-positive individuals were included for analysis (300 adenomas, 41 SSLs, 82 hyperplastic polyps). POLAR distinguished neoplastic from non-neoplastic lesions with 79% accuracy, 89% sensitivity, and 38% specificity. The endoscopists achieved 83% accuracy, 92% sensitivity, and 44% specificity. The optical diagnosis accuracy between POLAR and endoscopists was not significantly different (P = 0.10). The proportion of polyps in which POLAR was able to provide an optical diagnosis was 98% (i.e. success rate). CONCLUSIONS: We developed a CADx system that differentiated neoplastic from non-neoplastic diminutive polyps during endoscopy, with an accuracy comparable to that of screening endoscopists and a near-perfect success rate.
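For readers comparing the percentages above, the short sketch below shows how per-lesion sensitivity, specificity, and accuracy follow from a 2x2 table of CADx calls against histopathology. The counts are approximate back-calculations from the reported POLAR results (341 neoplastic and 82 non-neoplastic diminutive polyps) and are used for illustration only.

def diagnostic_metrics(tp, fn, tn, fp):
    # Standard per-lesion metrics for a binary optical diagnosis (neoplastic vs. non-neoplastic).
    sensitivity = tp / (tp + fn)          # neoplastic lesions correctly called neoplastic
    specificity = tn / (tn + fp)          # hyperplastic lesions correctly called non-neoplastic
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Approximate counts implied by the reported 89% sensitivity / 38% specificity
# on 341 neoplastic and 82 non-neoplastic diminutive polyps (illustrative only).
sens, spec, acc = diagnostic_metrics(tp=303, fn=38, tn=31, fp=51)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} accuracy={acc:.2f}")
# prints approximately: sensitivity=0.89 specificity=0.38 accuracy=0.79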
Affiliation(s)
- Britt B. S. L. Houwen
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Center, Amsterdam, the Netherlands
| | - Yark Hazewinkel
- Department of Gastroenterology and Hepatology, Radboud University Nijmegen Medical Center, Radboud University of Nijmegen, Nijmegen, The Netherlands
| | | | - Jasper L. A. Vleugels
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Center, Amsterdam, the Netherlands
| | - Nahid S. Mostafavi
- Department of Gastroenterology and Hepatology, Subdivision Statistics, Amsterdam University Medical Center, Academic Medical Center, University of Amsterdam, Amsterdam, the Netherlands
| | - Paul van Putten
- Department of Gastroenterology and Hepatology, Medical Center Leeuwarden, Leeuwarden, The Netherlands
| | - Paul Fockens
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Center, Amsterdam, the Netherlands
| | - Evelien Dekker
- Department of Gastroenterology and Hepatology, Amsterdam University Medical Center, Amsterdam, the Netherlands
- Bergman Clinics Maag and Darm Amsterdam, Amsterdam, The Netherlands
| |
|
37
|
Jahagirdar V, Bapaye J, Chandan S, Ponnada S, Kochhar GS, Navaneethan U, Mohan BP. Diagnostic accuracy of convolutional neural network-based machine learning algorithms in endoscopic severity prediction of ulcerative colitis: a systematic review and meta-analysis. Gastrointest Endosc 2023; 98:145-154.e8. [PMID: 37094691 DOI: 10.1016/j.gie.2023.04.2074] [Citation(s) in RCA: 11] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/31/2022] [Revised: 03/06/2023] [Accepted: 04/16/2023] [Indexed: 04/26/2023]
Abstract
BACKGROUND AND AIMS Endoscopic assessment of ulcerative colitis (UC) can be performed by using the Mayo Endoscopic Score (MES) or the Ulcerative Colitis Endoscopic Index of Severity (UCEIS). In this meta-analysis, we assessed the pooled diagnostic accuracy parameters of deep machine learning by means of convolutional neural network (CNN) algorithms in predicting UC severity on endoscopic images. METHODS Databases including MEDLINE, Scopus, and Embase were searched in June 2022. Outcomes of interest were the pooled accuracy, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV). Standard meta-analysis methods used the random-effects model, and heterogeneity was assessed using the I2 statistic. RESULTS Twelve studies were included in the final analysis. The pooled diagnostic parameters of CNN-based machine learning algorithms in endoscopic severity assessment of UC were as follows: accuracy 91.5% (95% confidence interval [CI], 88.3-93.8; I2 = 84%), sensitivity 82.8% (95% CI, 78.3-86.5; I2 = 89%), specificity 92.4% (95% CI, 89.4-94.6; I2 = 84%), PPV 86.6% (95% CI, 82.3-90; I2 = 89%), and NPV 88.6% (95% CI, 85.7-91; I2 = 78%). Subgroup analysis revealed significantly better sensitivity and PPV with the UCEIS scoring system compared with the MES (93.6% [95% CI, 87.5-96.8; I2 = 77%] vs 82% [95% CI, 75.6-87; I2 = 89%], P = .003, and 93.6% [95% CI, 88.7-96.4; I2 = 68%] vs 83.6% [95% CI, 76.8-88.8; I2 = 77%], P = .007, respectively). CONCLUSIONS CNN-based machine learning algorithms demonstrated excellent pooled diagnostic accuracy parameters in the endoscopic severity assessment of UC. Using UCEIS scores in CNN training might offer better results than the MES. Further studies are warranted to establish these findings in real clinical settings.
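The pooled estimates above rely on random-effects meta-analysis. The authors' exact models are not reproduced here; as a simplified, univariate illustration of the idea, the sketch below applies DerSimonian-Laird pooling and the I2 heterogeneity statistic to made-up per-study effect sizes and variances.

import numpy as np

def dersimonian_laird(effects, variances):
    # Univariate DerSimonian-Laird random-effects pooling with I2 heterogeneity.
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances                                  # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)               # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance estimate
    w_re = 1.0 / (variances + tau2)                      # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # percent heterogeneity
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Made-up logit-scale sensitivities and variances for five hypothetical studies
pooled, ci, i2 = dersimonian_laird([2.1, 1.8, 2.4, 1.6, 2.0], [0.05, 0.08, 0.04, 0.10, 0.06])
print(round(pooled, 2), [round(x, 2) for x in ci], round(i2, 1))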
Affiliation(s)
- Vinay Jahagirdar
- Department of Internal Medicine, University of Missouri Kansas City School of Medicine, Kansas City, Missouri, USA
| | - Jay Bapaye
- Department of Internal Medicine, Rochester General Hospital, Rochester, New York, USA
| | - Saurabh Chandan
- Department of Gastroenterology, Creighton University Medical Center, Creighton, Nebraska, USA
| | - Suresh Ponnada
- Internal Medicine, Roanoke Carilion Hospital, Roanoke, Virginia, USA
| | - Gursimran S Kochhar
- Department of Gastroenterology & Hepatology, Allegheny Health Network, Pittsburgh, Pennsylvania, USA
| | | | - Babu P Mohan
- Department of Gastroenterology & Hepatology, University of Utah, Salt Lake City, Utah, USA
| |
|
38
|
Nemoto D, Guo Z, Katsuki S, Takezawa T, Maemoto R, Kawasaki K, Inoue K, Akutagawa T, Tanaka H, Sato K, Omori T, Takanashi K, Hayashi Y, Nakajima Y, Miyakura Y, Matsumoto T, Yoshida N, Esaki M, Uraoka T, Kato H, Inoue Y, Peng B, Zhang R, Hisabe T, Matsuda T, Yamamoto H, Tanaka N, Lefor AK, Zhu X, Togashi K. Computer-aided diagnosis of early-stage colorectal cancer using nonmagnified endoscopic white-light images (with videos). Gastrointest Endosc 2023; 98:90-99.e4. [PMID: 36738793 DOI: 10.1016/j.gie.2023.01.050] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/14/2022] [Revised: 01/05/2023] [Accepted: 01/25/2023] [Indexed: 02/06/2023]
Abstract
BACKGROUND AND AIMS Differentiation of colorectal cancers (CRCs) with deep submucosal invasion (T1b) from CRCs with superficial invasion (T1a) or no invasion (Tis) is not straightforward. This study aimed to develop a computer-aided diagnosis (CADx) system to establish the diagnosis of early-stage cancers using nonmagnified endoscopic white-light images alone. METHODS From 5108 images, 1513 lesions (Tis, 1074; T1a, 145; T1b, 294) were collected from 1470 patients at 10 academic hospitals and assigned to training and testing datasets (3:1). The ResNet-50 network was used as the backbone to extract features from images. Oversampling and focal loss were used to compensate for class imbalance of the invasive stage. Diagnostic performance was assessed using the testing dataset including 403 CRCs with 1392 images. Two experts and 2 trainees read the identical testing dataset. RESULTS At a 90% cutoff for the per-lesion score, CADx showed the highest specificity of 94.4% (95% confidence interval [CI], 91.3-96.6), with 59.8% (95% CI, 48.3-70.4) sensitivity and 87.3% (95% CI, 83.7-90.4) accuracy. The area under the receiver operating characteristic curve was 85.1% (95% CI, 79.9-90.4) for CADx, 88.2% (95% CI, 83.7-92.8) for expert 1, 85.9% (95% CI, 80.9-90.9) for expert 2, 77.0% (95% CI, 71.5-82.4) for trainee 1 (vs CADx; P = .0076), and 66.2% (95% CI, 60.6-71.9) for trainee 2 (P < .0001). Performance was also confirmed on 9 short videos. CONCLUSIONS A CADx system developed with endoscopic white-light images showed excellent per-lesion specificity and accuracy for T1b lesion diagnosis, equivalent to experts and superior to trainees. (Clinical trial registration number: UMIN000037053.)
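Oversampling and focal loss are generic remedies for class imbalance rather than anything specific to this study. A minimal PyTorch sketch of a binary focal loss is shown below; the alpha and gamma values are illustrative assumptions, not the authors' settings. The idea is that well-classified (easy) lesions contribute little to the loss, so the network concentrates on the rare, hard T1b cases.

import torch
import torch.nn.functional as F

def binary_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    # Focal loss for imbalanced binary classification (e.g., T1b vs. Tis/T1a lesions).
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)            # probability assigned to the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets) # re-weight the rare positive class
    return (alpha_t * (1 - p_t) ** gamma * bce).mean()

logits = torch.tensor([2.0, -1.0, 0.3])    # raw per-lesion scores from a ResNet-style backbone
targets = torch.tensor([1.0, 0.0, 1.0])    # 1 = deep submucosal invasion (T1b)
print(binary_focal_loss(logits, targets))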
Affiliation(s)
- Daiki Nemoto
- Department of Coloproctology, Aizu Medical Center Fukushima Medical University, Aizuwakamatsu, Japan
| | - Zhe Guo
- Biomedical Information Engineering Lab, The University of Aizu, Aizuwakamatsu, Japan; Department of Laboratory Medicine, The Second Xiangya Hospital, Central South University, Changsha, China
| | - Shinichi Katsuki
- Department of Gastroenterology, Otaru Ekisaikai Hospital, Otaru, Japan
| | - Takahito Takezawa
- Department of Medicine, Division of Gastroenterology, Jichi Medical University, Shimotsuke, Japan
| | - Ryo Maemoto
- Department of Surgery, Saitama Medical Center, Jichi Medical University, Saitama, Japan
| | - Keisuke Kawasaki
- Department of Gastroenterology, Iwate Medical University, Morioka, Japan
| | - Ken Inoue
- Department of Molecular Gastroenterology and Hepatology, Kyoto Prefectural University of Medicine, Kyoto, Japan
| | - Takashi Akutagawa
- Division of Gastroenterology, Department of Internal Medicine, Faculty of Medicine, Saga University, Saga, Japan
| | - Hirohito Tanaka
- Department of Gastroenterology and Hepatology, Gunma University Graduate School of Medicine, Maebashi, Japan
| | - Koichiro Sato
- Department of Clinical Laboratory and Endoscopy, Tokyo Women's Medical University Medical Center East, Tokyo, Japan
| | - Teppei Omori
- Institute of Gastroenterology, Tokyo Women's Medical University, Tokyo, Japan
| | | | - Yoshikazu Hayashi
- Department of Medicine, Division of Gastroenterology, Jichi Medical University, Shimotsuke, Japan
| | - Yuki Nakajima
- Department of Coloproctology, Aizu Medical Center Fukushima Medical University, Aizuwakamatsu, Japan
| | - Yasuyuki Miyakura
- Department of Surgery, Saitama Medical Center, Jichi Medical University, Saitama, Japan
| | - Takayuki Matsumoto
- Department of Gastroenterology, Iwate Medical University, Morioka, Japan
| | - Naohisa Yoshida
- Department of Molecular Gastroenterology and Hepatology, Kyoto Prefectural University of Medicine, Kyoto, Japan
| | - Motohiro Esaki
- Division of Gastroenterology, Department of Internal Medicine, Faculty of Medicine, Saga University, Saga, Japan
| | - Toshio Uraoka
- Department of Gastroenterology and Hepatology, Gunma University Graduate School of Medicine, Maebashi, Japan
| | - Hiroyuki Kato
- Department of Clinical Laboratory and Endoscopy, Tokyo Women's Medical University Medical Center East, Tokyo, Japan
| | - Yuji Inoue
- Institute of Gastroenterology, Tokyo Women's Medical University, Tokyo, Japan
| | - Boyuan Peng
- Biomedical Information Engineering Lab, The University of Aizu, Aizuwakamatsu, Japan
| | - Ruiyao Zhang
- Biomedical Information Engineering Lab, The University of Aizu, Aizuwakamatsu, Japan
| | - Takashi Hisabe
- Department of Gastroenterology, Fukuoka University Chikushi Hospital, Fukuoka, Japan
| | - Tomoki Matsuda
- Department of Gastroenterology, Sendai Kosei Hospital, Sendai, Japan
| | - Hironori Yamamoto
- Department of Medicine, Division of Gastroenterology, Jichi Medical University, Shimotsuke, Japan
| | - Noriko Tanaka
- Health Data Science Research Section, Tokyo Metropolitan Institute of Gerontology, Tokyo, Japan
| | | | - Xin Zhu
- Biomedical Information Engineering Lab, The University of Aizu, Aizuwakamatsu, Japan
| | - Kazutomo Togashi
- Department of Coloproctology, Aizu Medical Center Fukushima Medical University, Aizuwakamatsu, Japan
| |
|
39
|
Maida M, Marasco G, Facciorusso A, Shahini E, Sinagra E, Pallio S, Ramai D, Murino A. Effectiveness and application of artificial intelligence for endoscopic screening of colorectal cancer: the future is now. Expert Rev Anticancer Ther 2023; 23:719-729. [PMID: 37194308 DOI: 10.1080/14737140.2023.2215436] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/02/2022] [Accepted: 05/15/2023] [Indexed: 05/18/2023]
Abstract
INTRODUCTION Artificial intelligence (AI) in gastrointestinal endoscopy includes systems designed to interpret medical images and increase sensitivity during examination. This may be a promising solution to human biases and may provide support during diagnostic endoscopy. AREAS COVERED This review aims to summarize and evaluate data supporting AI technologies in lower endoscopy, addressing their effectiveness, limitations, and future perspectives. EXPERT OPINION Computer-aided detection (CADe) systems have been studied with promising results, allowing for an increase in adenoma detection rate (ADR), adenomas per colonoscopy (APC), and a reduction in adenoma miss rate (AMR). This may lead to an increase in the sensitivity of endoscopic examinations and a reduction in the risk of interval colorectal cancer. In addition, computer-aided characterization (CADx) has also been implemented, aiming to distinguish adenomatous and non-adenomatous lesions through real-time assessment using advanced endoscopic imaging techniques. Moreover, computer-aided quality (CADq) systems have been developed with the aim of standardizing quality measures in colonoscopy (e.g. withdrawal time and adequacy of bowel cleansing) both to improve the quality of examinations and to set a reference standard for randomized controlled trials.
Affiliation(s)
- Marcello Maida
- Gastroenterology and Endoscopy Unit, S. Elia-Raimondi Hospital, Caltanissetta, Italy
| | - Giovanni Marasco
- IRCCS Azienda Ospedaliero Universitaria di Bologna, Bologna, Italy
- Department of Medical and Surgical Sciences, University of Bologna, Bologna, Italy
| | - Antonio Facciorusso
- Department of Medical and Surgical Sciences, University of Foggia, Foggia, Italy
| | - Endrit Shahini
- Gastroenterology Unit, National Institute of Gastroenterology-IRCCS "Saverio de Bellis", Castellana Grotte, Bari, Italy
| | - Emanuele Sinagra
- Gastroenterology and Endoscopy Unit, Fondazione Istituto San Raffaele Giglio, Cefalu, Italy
| | - Socrate Pallio
- Digestive Diseases Endoscopy Unit, Policlinico G. Martino Hospital, University of Messina, Messina, Italy
| | - Daryl Ramai
- Gastroenterology & Hepatology, University of Utah Health, Salt Lake City, UT, USA
| | - Alberto Murino
- Royal Free Unit for Endoscopy, The Royal Free Hospital and University College London Institute for Liver and Digestive Health, Hampstead, London, UK
- Department of Gastroenterology, Cleveland Clinic London, London, UK
| |
|
40
|
Dos Santos CEO, Malaman D, Arciniegas Sanmartin ID, Leão ABS, Leão GS, Pereira-Lima JC. Performance of artificial intelligence in the characterization of colorectal lesions. Saudi J Gastroenterol 2023; 29:219-224. [PMID: 37203122 PMCID: PMC10445495 DOI: 10.4103/sjg.sjg_316_22] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/14/2022] [Accepted: 04/10/2023] [Indexed: 05/20/2023] Open
Abstract
Background Image-enhanced endoscopy (IEE) has been used in the differentiation between neoplastic and non-neoplastic colorectal lesions through microvasculature analysis. This study aimed to evaluate the computer-aided diagnosis (CADx) mode of the CAD EYE system for the optical diagnosis of colorectal lesions and compare it with the performance of an expert, in addition to evaluating the computer-aided detection (CADe) mode in terms of polyp detection rate (PDR) and adenoma detection rate (ADR). Methods A prospective study was conducted to evaluate the performance of CAD EYE using blue light imaging (BLI), dichotomizing lesions into hyperplastic and neoplastic, and of an expert based on the Japan Narrow-Band Imaging Expert Team (JNET) classification for the characterization of lesions. After white light imaging (WLI) diagnosis, magnification was used on all lesions, which were removed and examined histologically. Diagnostic criteria were evaluated, and PDR and ADR were calculated. Results A total of 110 lesions (80 (72.7%) dysplastic lesions and 30 (27.3%) nondysplastic lesions) were evaluated in 52 patients, with a mean lesion size of 4.3 mm. Artificial intelligence (AI) analysis showed 81.8% accuracy, 76.3% sensitivity, 96.7% specificity, 98.5% positive predictive value (PPV), and 60.4% negative predictive value (NPV). The kappa value was 0.61, and the area under the receiver operating characteristic curve (AUC) was 0.87. Expert analysis showed 93.6% accuracy, 92.5% sensitivity, 96.7% specificity, 98.7% PPV, and 82.9% NPV. The kappa value was 0.85, and the AUC was 0.95. Overall, PDR was 67.6% and ADR was 45.9%. Conclusions The CADx mode showed good accuracy in characterizing colorectal lesions, but the expert assessment was superior in almost all diagnostic criteria. PDR and ADR were high.
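The kappa values quoted above quantify agreement beyond chance between the optical diagnosis and histology. The sketch below computes Cohen's kappa from a 2x2 agreement table; the counts are approximate back-calculations from the reported CAD EYE figures (80 dysplastic and 30 nondysplastic lesions) and serve only to illustrate the calculation.

def cohens_kappa(table):
    # Cohen's kappa for a square agreement table (rows = histology, columns = optical diagnosis).
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n                       # observed agreement
    pe = sum(sum(table[i]) * sum(r[i] for r in table) for i in range(len(table))) / n ** 2
    return (po - pe) / (1 - pe)

# Approximate 2x2 table implied by 76.3% sensitivity and 96.7% specificity on
# 80 dysplastic and 30 nondysplastic lesions (illustrative only).
print(round(cohens_kappa([[61, 19],
                          [1, 29]]), 2))   # approximately 0.61, matching the reported kappa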
Affiliation(s)
- Carlos E. O. Dos Santos
- Department of Endoscopy, Santa Casa de Caridade Hospital, Bagé, RS, Brazil
- Department of Endoscopy, Pontifícia Universidade Católica do Rio Grande do Sul, Porto Alegre, RS, Brazil
| | - Daniele Malaman
- Department of Endoscopy, Santa Casa de Caridade Hospital, Bagé, RS, Brazil
| | | | - Ari B. S. Leão
- Department of Endoscopy, Pontifícia Universidade Católica do Rio Grande do Sul, Porto Alegre, RS, Brazil
| | - Gabriel S. Leão
- Department of Endoscopy, Pontifícia Universidade Católica do Rio Grande do Sul, Porto Alegre, RS, Brazil
| | - Júlio C. Pereira-Lima
- Department of Gastroenterology and Endoscopy, Santa Casa Hospital, Porto Alegre, RS, Brazil
| |
|
41
|
Kader R, Cid‐Mejias A, Brandao P, Islam S, Hebbar S, Puyal JG, Ahmad OF, Hussein M, Toth D, Mountney P, Seward E, Vega R, Stoyanov D, Lovat LB. Polyp characterization using deep learning and a publicly accessible polyp video database. Dig Endosc 2023; 35:645-655. [PMID: 36527309 PMCID: PMC10570984 DOI: 10.1111/den.14500] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/14/2022] [Accepted: 12/13/2022] [Indexed: 01/20/2023]
Abstract
OBJECTIVES Convolutional neural networks (CNN) for computer-aided diagnosis of polyps are often trained using high-quality still images in a single chromoendoscopy imaging modality, with sessile serrated lesions (SSLs) often excluded. This study developed a CNN from videos to classify polyps as adenomatous or nonadenomatous using standard narrow-band imaging (NBI) and NBI-near focus (NBI-NF) and created a publicly accessible polyp video database. METHODS We trained a CNN with 16,832 high and moderate quality frames from 229 polyp videos (56 SSLs). It was evaluated with 222 polyp videos (36 SSLs) across two test-sets. Test-set I consists of 14,320 frames (157 polyps, 111 diminutive). Test-set II, which is publicly accessible, consists of 3317 video frames (65 polyps, 41 diminutive) and was benchmarked against three expert and three nonexpert endoscopists. RESULTS Sensitivity for adenoma characterization was 91.6% in test-set I and 89.7% in test-set II. Specificity was 91.9% and 88.5%. Sensitivity for diminutive polyps was 89.9% and 87.5%; specificity 90.5% and 88.2%. In NBI-NF, sensitivity was 89.4% and 89.5%, with a specificity of 94.7% and 83.3%. In NBI, sensitivity was 85.3% and 91.7%, with a specificity of 87.5% and 90.0%, respectively. The CNN achieved the preservation and incorporation of valuable endoscopic innovations (PIVI)-1 and PIVI-2 thresholds for each test-set. In the benchmarking of test-set II, the CNN was significantly more accurate than nonexperts (13.8% difference [95% confidence interval 3.2-23.6], P = 0.01), with no significant difference from experts. CONCLUSIONS A single CNN can differentiate adenomas from SSLs and hyperplastic polyps in both NBI and NBI-NF. A publicly accessible NBI polyp video database was created and benchmarked.
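Frame-level CNN outputs have to be reduced to a single optical diagnosis per polyp before metrics like those above can be computed. The abstract does not state the aggregation rule, so the sketch below shows one commonly used, assumed approach: averaging per-frame adenoma probabilities over the usable frames of a clip and applying a threshold.

import numpy as np

def per_polyp_diagnosis(frame_probs, threshold=0.5, min_frames=5):
    # Aggregate per-frame adenoma probabilities into one per-polyp optical diagnosis.
    # Returns 'no diagnosis' when too few usable (e.g., high/moderate quality) frames exist.
    probs = np.asarray(frame_probs, float)
    if probs.size < min_frames:
        return "no diagnosis"
    return "adenomatous" if probs.mean() >= threshold else "nonadenomatous"

# Per-frame CNN outputs for two NBI polyp clips (illustrative values)
print(per_polyp_diagnosis([0.91, 0.85, 0.77, 0.88, 0.95, 0.83]))   # adenomatous
print(per_polyp_diagnosis([0.31, 0.22, 0.40, 0.18, 0.27]))         # nonadenomatous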
Affiliation(s)
- Rawen Kader
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
| | | | | | - Shahraz Islam
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
| | | | - Juana González‐Bueno Puyal
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Odin Vision Ltd, London, UK
| | - Omer F. Ahmad
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
| | - Mohamed Hussein
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
| | | | | | - Ed Seward
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
| | - Roser Vega
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
| | - Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
| | - Laurence B. Lovat
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London, UK
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
| |
|
42
|
Du RC, Ouyang YB, Hu Y. Research trends on artificial intelligence and endoscopy in digestive diseases: A bibliometric analysis from 1990 to 2022. World J Gastroenterol 2023; 29:3561-3573. [PMID: 37389238 PMCID: PMC10303508 DOI: 10.3748/wjg.v29.i22.3561] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/04/2023] [Revised: 04/03/2023] [Accepted: 05/04/2023] [Indexed: 06/06/2023] Open
Abstract
BACKGROUND Recently, artificial intelligence (AI) has been widely used in gastrointestinal endoscopy examinations.
AIM To comprehensively evaluate the application of AI-assisted endoscopy in detecting different digestive diseases using bibliometric analysis.
METHODS Relevant publications from the Web of Science published from 1990 to 2022 were extracted using a combination of the search terms “AI” and “endoscopy”. The following information was recorded from the included publications: Title, author, institution, country, endoscopy type, disease type, performance of AI, publication, citation, journal and H-index.
RESULTS A total of 446 studies were included. The number of articles reached its peak in 2021, and the annual citation numbers increased after 2006. China, the United States and Japan were the dominant countries in this field, accounting for 28.7%, 16.8%, and 15.7% of publications, respectively. The Tada Tomohiro Institute of Gastroenterology and Proctology was the most influential institution. “Cancer” and “polyps” were the hotspots in this field. Colorectal polyps were the most intensively studied disease, followed by gastric cancer and gastrointestinal bleeding. Conventional endoscopy was the most common type of examination. The accuracy of AI in detecting Barrett's esophagus, colorectal polyps and gastric cancer from 2018 to 2022 was 87.6%, 93.7% and 88.3%, respectively. The detection rates of adenoma and gastrointestinal bleeding from 2018 to 2022 were 31.3% and 96.2%, respectively.
CONCLUSION AI could improve the detection rate of digestive tract diseases and a convolutional neural network-based diagnosis program for endoscopic images shows promising results.
Affiliation(s)
- Ren-Chun Du
- Department of Gastroenterology, The First Affiliated Hospital of Nanchang University, Nanchang 330006, Jiangxi Province, China
| | - Yao-Bin Ouyang
- Department of Gastroenterology, The First Affiliated Hospital of Nanchang University, Nanchang 330006, Jiangxi Province, China
- Department of Oncology, Mayo Clinic, Rochester, MN 55905, United States
| | - Yi Hu
- Department of Gastroenterology, The First Affiliated Hospital of Nanchang University, Nanchang 330006, Jiangxi Province, China
- Department of Surgery, The Chinese University of Hong Kong, Hong Kong 999077, China
| |
|
43
|
Sinonquel P, Vermeire S, Maes F, Bisschops R. Advanced Imaging in Gastrointestinal Endoscopy: A Literature Review of the Current State of the Art. GE PORTUGUESE JOURNAL OF GASTROENTEROLOGY 2023; 30:175-191. [PMID: 37387720 PMCID: PMC10305270 DOI: 10.1159/000527083] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/30/2022] [Accepted: 09/11/2022] [Indexed: 01/03/2025]
Abstract
BACKGROUND AND AIMS Gastrointestinal (GI) endoscopy has evolved considerably in recent decades. Imaging has progressed from standard white-light endoscopes to high-definition endoscopes combined with multiple color-enhancement techniques, and on to automated endoscopic assessment systems based on artificial intelligence. This narrative literature review aimed to provide a detailed overview of the latest evolutions within the field of advanced GI endoscopy, mainly focusing on the screening, diagnosis, and surveillance of common upper and lower GI pathology. METHODS This review comprises only literature about screening, diagnosis, and surveillance strategies using advanced endoscopic imaging techniques published in (inter)national peer-reviewed journals and written in English. Only studies restricted to adult patients were selected. A search was performed using the MeSH terms: dye-based chromoendoscopy, virtual chromoendoscopy, video enhancement technique, upper GI tract, lower GI tract, Barrett's esophagus, esophageal squamous cell carcinoma, gastric cancer, colorectal polyps, inflammatory bowel disease, artificial intelligence. This review does not elaborate on the therapeutic application or impact of advanced GI endoscopy. CONCLUSIONS Focusing on current and future applications and evolutions in the field of both upper and lower GI advanced endoscopy, this overview is a practical yet detailed account of the latest developments. Particular attention is paid to artificial intelligence and its recent developments in GI endoscopy. Additionally, the literature is weighed against current international guidelines and assessed for its potential positive future impact.
Affiliation(s)
- Pieter Sinonquel
- Department of Gastroenterology and Hepatology, University Hospitals Leuven, Leuven, Belgium
- Department of Translational Research in Gastrointestinal Diseases (TARGID), KU Leuven, Leuven, Belgium
| | - Séverine Vermeire
- Department of Gastroenterology and Hepatology, University Hospitals Leuven, Leuven, Belgium
- Department of Translational Research in Gastrointestinal Diseases (TARGID), KU Leuven, Leuven, Belgium
| | - Frederik Maes
- Department of Electrical Engineering (ESAT), KU Leuven, Leuven, Belgium
| | - Raf Bisschops
- Department of Gastroenterology and Hepatology, University Hospitals Leuven, Leuven, Belgium
- Department of Translational Research in Gastrointestinal Diseases (TARGID), KU Leuven, Leuven, Belgium
| |
|
44
|
Sharma A, Kumar R, Yadav G, Garg P. Artificial intelligence in intestinal polyp and colorectal cancer prediction. Cancer Lett 2023; 565:216238. [PMID: 37211068 DOI: 10.1016/j.canlet.2023.216238] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/06/2023] [Revised: 05/17/2023] [Accepted: 05/17/2023] [Indexed: 05/23/2023]
Abstract
Artificial intelligence (AI) algorithms and their application to disease detection and decision support for healthcare professionals have evolved greatly over the past decade. AI has been widely applied and explored in gastroenterology for endoscopic analysis to diagnose intestinal cancers, premalignant polyps, gastrointestinal inflammatory lesions, and bleeding. Patients' responses to treatment and their prognoses have both been predicted using AI by combining multiple algorithms. In this review, we explored recent applications of AI algorithms in the identification and characterization of intestinal polyps and in colorectal cancer prediction. AI-based prediction models have the potential to help medical practitioners diagnose, establish prognoses, and reach accurate conclusions for the treatment of patients. With the understanding that rigorous validation of AI approaches in randomized controlled studies is required by health authorities before widespread clinical use, the article also discusses the limitations and challenges associated with deploying AI systems to diagnose intestinal malignancies and premalignant lesions.
Affiliation(s)
- Anju Sharma
- Department of Pharmacoinformatics, National Institute of Pharmaceutical Education and Research, S.A.S Nagar, 160062, Punjab, India
| | - Rajnish Kumar
- Amity Institute of Biotechnology, Amity University Uttar Pradesh, Lucknow Campus, Uttar Pradesh, 226010, India; Department of Veterinary Medicine and Surgery, College of Veterinary Medicine, University of Missouri, Columbia, MO, USA
| | - Garima Yadav
- Amity Institute of Biotechnology, Amity University Uttar Pradesh, Lucknow Campus, Uttar Pradesh, 226010, India
| | - Prabha Garg
- Department of Pharmacoinformatics, National Institute of Pharmaceutical Education and Research, S.A.S Nagar, 160062, Punjab, India.
| |
|
45
|
Gimeno-García AZ, Hernández-Pérez A, Nicolás-Pérez D, Hernández-Guerra M. Artificial Intelligence Applied to Colonoscopy: Is It Time to Take a Step Forward? Cancers (Basel) 2023; 15:cancers15082193. [PMID: 37190122 DOI: 10.3390/cancers15082193] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2023] [Revised: 04/04/2023] [Accepted: 04/05/2023] [Indexed: 05/17/2023] Open
Abstract
Growing evidence indicates that artificial intelligence (AI) applied to medicine is here to stay. In gastroenterology, AI computer vision applications have been identified as a research priority. The two main AI system categories are computer-aided polyp detection (CADe) and computer-assisted diagnosis (CADx). However, other fields of expansion relate to colonoscopy quality, such as methods to objectively assess colon cleansing during the colonoscopy, devices to automatically predict and improve bowel cleansing before the examination, prediction of deep submucosal invasion, reliable measurement of colorectal polyps, and accurate localization of colorectal lesions in the colon. Although growing evidence indicates that AI systems could improve some of these quality metrics, there are concerns regarding cost-effectiveness, and large multicenter randomized studies with strong outcomes, such as post-colonoscopy colorectal cancer incidence and mortality, are lacking. The integration of all these tasks into one quality-improvement device could facilitate the incorporation of AI systems into clinical practice. In this manuscript, the current status of the role of AI in colonoscopy is reviewed, together with its current applications, drawbacks, and areas for improvement.
Affiliation(s)
- Antonio Z Gimeno-García
- Gastroenterology Department, Hospital Universitario de Canarias, 38200 San Cristóbal de La Laguna, Tenerife, Spain
- Instituto Universitario de Tecnologías Biomédicas (ITB) & Centro de Investigación Biomédica de Canarias (CIBICAN), Internal Medicine Department, Universidad de La Laguna, 38200 San Cristóbal de La Laguna, Tenerife, Spain
| | - Anjara Hernández-Pérez
- Gastroenterology Department, Hospital Universitario de Canarias, 38200 San Cristóbal de La Laguna, Tenerife, Spain
- Instituto Universitario de Tecnologías Biomédicas (ITB) & Centro de Investigación Biomédica de Canarias (CIBICAN), Internal Medicine Department, Universidad de La Laguna, 38200 San Cristóbal de La Laguna, Tenerife, Spain
| | - David Nicolás-Pérez
- Gastroenterology Department, Hospital Universitario de Canarias, 38200 San Cristóbal de La Laguna, Tenerife, Spain
- Instituto Universitario de Tecnologías Biomédicas (ITB) & Centro de Investigación Biomédica de Canarias (CIBICAN), Internal Medicine Department, Universidad de La Laguna, 38200 San Cristóbal de La Laguna, Tenerife, Spain
| | - Manuel Hernández-Guerra
- Gastroenterology Department, Hospital Universitario de Canarias, 38200 San Cristóbal de La Laguna, Tenerife, Spain
- Instituto Universitario de Tecnologías Biomédicas (ITB) & Centro de Investigación Biomédica de Canarias (CIBICAN), Internal Medicine Department, Universidad de La Laguna, 38200 San Cristóbal de La Laguna, Tenerife, Spain
| |
|
46
|
Mazumdar S, Sinha S, Jha S, Jagtap B. Computer-aided automated diminutive colonic polyp detection in colonoscopy by using deep machine learning system; first indigenous algorithm developed in India. Indian J Gastroenterol 2023; 42:226-232. [PMID: 37145230 DOI: 10.1007/s12664-022-01331-7] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/11/2022] [Accepted: 12/18/2022] [Indexed: 05/06/2023]
Abstract
BACKGROUND Colonic polyps can be detected and resected during a colonoscopy before cancer development. However, about one quarter of polyps may be missed due to their small size, location, or human error. An artificial intelligence (AI) system can improve polyp detection and reduce colorectal cancer incidence. We are developing an indigenous AI system to detect diminutive polyps in real-life scenarios that is compatible with any high-definition colonoscopy and endoscopic video-capture software. METHODS We trained a masked region-based convolutional neural network model to detect and localize colonic polyps. Three independent datasets of colonoscopy videos comprising 1,039 image frames were used and divided into a training dataset of 688 frames and a testing dataset of 351 frames. Of the 1,039 image frames, 231 were from real-life colonoscopy videos from our centre. The rest were from publicly available image frames already modified to be directly usable for developing the AI system. The image frames of the testing dataset were also augmented by rotating and zooming the images to replicate real-life distortions of images seen during colonoscopy. The AI system was trained to localize each polyp by creating a 'bounding box'. It was then applied to the testing dataset to test its accuracy in detecting polyps automatically. RESULTS The AI system achieved a mean average precision (equivalent to specificity) of 88.63% for automatic polyp detection. All polyps in the testing dataset were identified by the AI, i.e., there were no false-negative results (sensitivity of 100%). The mean polyp size in the study was 5 (± 4) mm. The mean processing time per image frame was 96.4 minutes. CONCLUSIONS This AI system, when applied to real-life colonoscopy images with wide variations in bowel preparation and small polyp size, can detect colonic polyps with a high degree of accuracy.
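Detection metrics such as the mean average precision reported above are built on matching predicted bounding boxes to annotated polyps by intersection-over-union (IoU). The helper below is a generic illustration of that matching step, not the authors' evaluation code; the 0.5 IoU threshold and the example boxes are assumptions.

def iou(box_a, box_b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def match_detections(pred_boxes, gt_boxes, thr=0.5):
    # Greedy matching of predicted polyp boxes to ground-truth boxes at an IoU threshold.
    unmatched_gt = list(gt_boxes)
    tp = fp = 0
    for pred in pred_boxes:
        best = max(unmatched_gt, key=lambda g: iou(pred, g), default=None)
        if best is not None and iou(pred, best) >= thr:
            unmatched_gt.remove(best)
            tp += 1
        else:
            fp += 1
    return tp, fp, len(unmatched_gt)      # true positives, false positives, missed polyps

# One colonoscopy frame: a predicted box that overlaps the annotated diminutive polyp
print(match_detections([(40, 50, 90, 100)], [(45, 55, 95, 105)]))   # (1, 0, 0)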
Affiliation(s)
- Srijan Mazumdar
- Indian Institute of Liver and Digestive Sciences, Sitala (East), Jagadishpur, Sonarpur, 24 Parganas (South), Kolkata, 700 150, India.
| | - Saugata Sinha
- Visvesvaraya National Institute of Technology, South Ambazari Road, Nagpur, 440 010, India
| | - Saurabh Jha
- Visvesvaraya National Institute of Technology, South Ambazari Road, Nagpur, 440 010, India
| | - Balaji Jagtap
- Visvesvaraya National Institute of Technology, South Ambazari Road, Nagpur, 440 010, India
| |
|
47
|
Gong R, He S, Tian T, Chen J, Hao Y, Qiao C. FRCNN-AA-CIF: An automatic detection model of colon polyps based on attention awareness and context information fusion. Comput Biol Med 2023; 158:106787. [PMID: 37044051 DOI: 10.1016/j.compbiomed.2023.106787] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2022] [Revised: 03/03/2023] [Accepted: 03/11/2023] [Indexed: 04/08/2023]
Abstract
The foreground and background of polyp images captured during colonoscopy are not highly differentiated, and the feature maps extracted by common deep learning object detection models keep shrinking as network depth increases. These models therefore tend to ignore fine image details, resulting in a high polyp missed detection rate. To reduce the missed detection rate, this paper proposes an automatic colon polyp detection model based on attention awareness and context information fusion (FRCNN-AA-CIF), built on the two-stage object detection model Faster Region-based Convolutional Neural Network (FR-CNN). First, since attention awareness can make the feature extraction network pay more attention to polyp features, we propose an attention awareness module based on the Squeeze-and-Excitation Network (SENet) and the Efficient Channel Attention module (ECA-Net) and add it after each block of the backbone network. Specifically, we first use the 1×1 convolution of ECA-Net to extract local cross-channel information and then use the two fully connected layers of SENet to reduce and then restore the channel dimension, filtering out the channels that are most useful for feature learning. Further, because of the presence of air bubbles, impurities, inflammation, and accumulated digestive matter around polyps, we use the context information around polyps to strengthen the focus on polyp features. In particular, after the network extracts a region of interest, we fuse the region of interest with its context information to improve the polyp detection rate. The proposed model was tested on the colonoscopy dataset provided by Huashan Hospital. Numerical experiments show that FRCNN-AA-CIF achieved the highest detection accuracy (mAP of 0.817), the lowest missed detection rate (MDR, 4.22%), and the best classification performance (AUC of 95.98%): compared with other object detection models, its mAP increased by 3.3%, MDR decreased by 1.97%, and AUC increased by 1.8%, significantly improving recognition accuracy and reducing the missed detection rate.
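The attention awareness module described above combines an ECA-style cross-channel operation with an SE-style fully connected bottleneck. The PyTorch sketch below is one plausible reading of that description, written for illustration rather than taken from the published implementation; the kernel size, reduction ratio, and placement after each backbone block are assumptions.

import torch
import torch.nn as nn

class ECASEAttention(nn.Module):
    # Channel attention mixing ECA-style local cross-channel interaction with an
    # SE-style fully connected bottleneck (one plausible reading of FRCNN-AA-CIF).
    def __init__(self, channels, k_size=3, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)                  # squeeze: global context per channel
        self.eca = nn.Conv1d(1, 1, kernel_size=k_size, padding=k_size // 2, bias=False)
        self.fc = nn.Sequential(                             # SE-style excitation
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        b, c, _, _ = x.shape
        y = self.pool(x).view(b, 1, c)        # (B, 1, C) so the 1D conv runs across channels
        y = self.eca(y).view(b, c)            # local cross-channel interaction
        w = self.fc(y).view(b, c, 1, 1)       # per-channel weights in (0, 1)
        return x * w                          # re-weight the backbone feature map

feat = torch.randn(2, 256, 32, 32)            # e.g., output of one backbone block
print(ECASEAttention(256)(feat).shape)        # torch.Size([2, 256, 32, 32])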
|
48
|
Gan P, Li P, Xia H, Zhou X, Tang X. The application of artificial intelligence in improving colonoscopic adenoma detection rate: Where are we and where are we going. GASTROENTEROLOGIA Y HEPATOLOGIA 2023; 46:203-213. [PMID: 35489584 DOI: 10.1016/j.gastrohep.2022.03.009] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/01/2021] [Revised: 03/08/2022] [Accepted: 03/18/2022] [Indexed: 02/08/2023]
Abstract
Colorectal cancer (CRC) is one of the most common malignant tumors in the world. Colonoscopy is the crucial examination technique in CRC screening programs, allowing early detection of precursor lesions and treatment of early colorectal cancer, which can significantly reduce the morbidity and mortality of CRC. However, pooled polyp miss rates during colonoscopic examination are as high as 22%. Artificial intelligence (AI) provides a promising way to improve the colonoscopic adenoma detection rate (ADR). It may assist endoscopists in avoiding missed polyps and offer an accurate optical diagnosis of suspected lesions. Herein, we describe some of the milestone studies on the use of AI in colonoscopy and future directions for AI in improving the colonoscopic ADR.
Affiliation(s)
- Peiling Gan
- Department of Gastroenterology, Affiliated Hospital of Southwest Medical University, Luzhou, China
| | - Peiling Li
- Department of Gastroenterology, Affiliated Hospital of Southwest Medical University, Luzhou, China
| | - Huifang Xia
- Department of Gastroenterology, Affiliated Hospital of Southwest Medical University, Luzhou, China
| | - Xian Zhou
- Department of Gastroenterology, Affiliated Hospital of Southwest Medical University, Luzhou, China
| | - Xiaowei Tang
- Department of Gastroenterology, Affiliated Hospital of Southwest Medical University, Luzhou, China; Department of Gastroenterology, The First Medical Center of Chinese PLA General Hospital, Beijing, China.
| |
|
49
|
Huang P, Feng Z, Shu X, Wu A, Wang Z, Hu T, Cao Y, Tu Y, Li Z. A bibliometric and visual analysis of publications on artificial intelligence in colorectal cancer (2002-2022). Front Oncol 2023; 13:1077539. [PMID: 36824138 PMCID: PMC9941644 DOI: 10.3389/fonc.2023.1077539] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/23/2022] [Accepted: 01/27/2023] [Indexed: 02/10/2023] Open
Abstract
Background Colorectal cancer (CRC) has the third-highest incidence and second-highest mortality rate of all cancers worldwide. Early diagnosis and screening of CRC have been the focus of research in this field. With the continuous development of artificial intelligence (AI) technology, AI has advantages in many aspects of CRC, such as adenoma screening, genetic testing, and prediction of tumor metastasis. Objective This study uses bibliometrics to analyze research on AI in CRC, summarize the field's history and current status of research, and predict future research directions. Method We searched the SCIE database for all literature on CRC and AI. The documents span the period 2002-2022. We used bibliometrics to analyze the data of these papers, such as authors, countries, institutions, and references. Co-authorship, co-citation, and co-occurrence analyses were the main methods of analysis. CiteSpace, VOSviewer, and SCImago Graphica were used to visualize the results. Result This study selected 1,531 articles on AI in CRC. China published the largest number of articles in this field (580). The United States had the highest-quality publications, with an average of 46.13 citations per article. Mori Y and Ding K were the two authors with the highest number of articles. Scientific Reports, Cancers, and Frontiers in Oncology are the journals that published the most articles in this field. Institutions from China occupy the top 9 positions among the most productive institutions. We found that research on AI in this field mainly focuses on colonoscopy-assisted diagnosis, imageomics, and pathology examination. Conclusion AI in CRC is currently in the development stage with good prospects. AI is currently widely used in colonoscopy, imageomics, and pathology. However, the scope of AI applications is still limited, and there is a lack of inter-institutional collaboration. The pervasiveness of AI technology is the main direction of future development in this field.
Affiliation(s)
- Pan Huang
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Zongfeng Feng
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Xufeng Shu
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Ahao Wu
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Zhonghao Wang
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Tengcheng Hu
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Yi Cao
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China
| | - Yi Tu
- Department of Pathology, The First Affiliated Hospital of Nanchang University, Nanchang, China; *Correspondence: Yi Tu; Zhengrong Li
| | - Zhengrong Li
- Department of General Surgery, First Affiliated Hospital of Nanchang University, Nanchang, China; Department of Digestive Surgery, Digestive Disease Hospital, The First Affiliated Hospital of Nanchang University, Nanchang, China; Medical Innovation Center, The First Affiliated Hospital of Nanchang University, Nanchang, China; *Correspondence: Yi Tu; Zhengrong Li
| |
|
50
|
Mansur A, Saleem Z, Elhakim T, Daye D. Role of artificial intelligence in risk prediction, prognostication, and therapy response assessment in colorectal cancer: current state and future directions. Front Oncol 2023; 13:1065402. [PMID: 36761957 PMCID: PMC9905815 DOI: 10.3389/fonc.2023.1065402] [Citation(s) in RCA: 14] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/09/2022] [Accepted: 01/09/2023] [Indexed: 01/26/2023] Open
Abstract
Artificial Intelligence (AI) is a branch of computer science that utilizes optimization, probabilistic, and statistical approaches to analyze and make predictions based on a vast amount of data. In recent years, AI has revolutionized the field of oncology and spearheaded novel approaches in the management of various cancers, including colorectal cancer (CRC). Notably, applications of AI to diagnose, prognosticate, and predict response to therapy in CRC are gaining traction and proving promising. There have also been several advancements in AI technologies to help predict metastases in CRC and in computer-aided detection (CAD) systems to reduce miss rates for colorectal neoplasia. This article provides a comprehensive review of the role of AI in predicting risk, prognosis, and response to therapy among patients with CRC.
Affiliation(s)
- Arian Mansur
- Harvard Medical School, Boston, MA, United States
| | | | - Tarig Elhakim
- Department of Radiology, Massachusetts General Hospital, Boston, MA, United States
| | - Dania Daye
- Department of Radiology, Massachusetts General Hospital, Boston, MA, United States
| |
|