1
Bobrow TL, Golhar M, Vijayan R, Akshintala VS, Garcia JR, Durr NJ. Colonoscopy 3D video dataset with paired depth from 2D-3D registration. Med Image Anal 2023; 90:102956. [PMID: 37713764] [PMCID: PMC10591895] [DOI: 10.1016/j.media.2023.102956]
Abstract
Screening colonoscopy is an important clinical application for several 3D computer vision techniques, including depth estimation, surface reconstruction, and missing region detection. However, the development, evaluation, and comparison of these techniques in real colonoscopy videos remain largely qualitative due to the difficulty of acquiring ground truth data. In this work, we present a Colonoscopy 3D Video Dataset (C3VD) acquired with a high definition clinical colonoscope and high-fidelity colon models for benchmarking computer vision methods in colonoscopy. We introduce a novel multimodal 2D-3D registration technique to register optical video sequences with ground truth rendered views of a known 3D model. The different modalities are registered by transforming optical images to depth maps with a Generative Adversarial Network and aligning edge features with an evolutionary optimizer. This registration method achieves an average translation error of 0.321 millimeters and an average rotation error of 0.159 degrees in simulation experiments where error-free ground truth is available. The method also leverages video information, improving registration accuracy by 55.6% for translation and 60.4% for rotation compared to single frame registration. 22 short video sequences were registered to generate 10,015 total frames with paired ground truth depth, surface normals, optical flow, occlusion, six degree-of-freedom pose, coverage maps, and 3D models. The dataset also includes screening videos acquired by a gastroenterologist with paired ground truth pose and 3D surface models. The dataset and registration source code are available at https://durr.jhu.edu/C3VD.
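The translation and rotation errors reported above are standard pose-discrepancy metrics between an estimated and a ground-truth camera pose. As a minimal illustrative sketch (not the paper's code), translation error is the Euclidean distance between translation vectors, and rotation error is the geodesic angle recovered from the relative rotation matrix:

```python
import math

def mat_T(R):
    # Transpose of a 3x3 matrix given as nested lists.
    return [[R[j][i] for j in range(3)] for i in range(3)]

def mat_mul(A, B):
    # Product of two 3x3 matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_error_deg(R_est, R_gt):
    """Geodesic angle (degrees) between two 3x3 rotation matrices."""
    R_rel = mat_mul(mat_T(R_est), R_gt)
    trace = R_rel[0][0] + R_rel[1][1] + R_rel[2][2]
    # Clamp for numerical safety before acos.
    c = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    return math.degrees(math.acos(c))

def translation_error_mm(t_est, t_gt):
    """Euclidean distance between translations (same units as inputs)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(t_est, t_gt)))

# A rotation of 10 degrees about z, compared against the identity pose.
theta = math.radians(10.0)
Rz = [[math.cos(theta), -math.sin(theta), 0.0],
      [math.sin(theta),  math.cos(theta), 0.0],
      [0.0, 0.0, 1.0]]
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(round(rotation_error_deg(Rz, I), 6))  # 10.0
print(translation_error_mm([1.0, 2.0, 2.0], [0.0, 0.0, 0.0]))  # 3.0
```

Averaging these two quantities over registered frames yields summary figures of the kind quoted in the abstract (0.321 mm, 0.159 degrees).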
Affiliation(s)
- Taylor L Bobrow: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
- Mayank Golhar: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
- Rohan Vijayan: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
- Venkata S Akshintala: Division of Gastroenterology and Hepatology, Johns Hopkins Medicine, Baltimore, MD 21287, USA
- Juan R Garcia: Department of Art as Applied to Medicine, Johns Hopkins School of Medicine, Baltimore, MD 21287, USA
- Nicholas J Durr: Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
2
Galati JS, Lin K, Gross SA. Recent advances in devices and technologies that might prove revolutionary for colonoscopy procedures. Expert Rev Med Devices 2023; 20:1087-1103. [PMID: 37934873] [DOI: 10.1080/17434440.2023.2280773]
Abstract
INTRODUCTION: Colorectal cancer (CRC) is the third most common malignancy and the second leading cause of cancer-related mortality in the world. Adenoma detection rate (ADR), a quality indicator for colonoscopy, has gained prominence because it is inversely related to CRC incidence and mortality. As such, recent efforts have focused on developing novel colonoscopy devices and technologies to improve ADR.
AREAS COVERED: The main objective of this paper is to provide an overview of advancements in colonoscopy mechanical attachments, artificial intelligence-assisted colonoscopy, and colonoscopy optical enhancements with respect to ADR. We accomplished this by performing a comprehensive search of multiple electronic databases from inception to September 2023. This review is intended as an introduction to colonoscopy devices and technologies.
EXPERT OPINION: Numerous mechanical attachments and optical enhancements have been developed that have the potential to improve ADR, and AI has gone from an inaccessible concept to a feasible means of improving ADR. While these advances are exciting and portend a change in what will be considered standard colonoscopy, they continue to require refinement. Future studies should focus on combining modalities to further improve ADR and on exploring the use of these technologies in other facets of colonoscopy.
Affiliation(s)
- Jonathan S Galati: Department of Internal Medicine, NYU Langone Health, New York, NY, USA
- Kevin Lin: Department of Internal Medicine, NYU Langone Health, New York, NY, USA
- Seth A Gross: Division of Gastroenterology, NYU Langone Health, New York, NY, USA
3
Xie L, Ge T, Xiao B, Han X, Zhang Q, Xu Z, He D, Tian W. Identification of Adolescent Menarche Status Using Biplanar X-ray Images: A Deep Learning-Based Method. Bioengineering (Basel) 2023; 10:769. [PMID: 37508796] [PMCID: PMC10375958] [DOI: 10.3390/bioengineering10070769]
Abstract
The purpose of this study is to develop an automated method for identifying the menarche status of adolescents from EOS biplanar radiographs. We designed a deep-learning-based algorithm comprising a region-of-interest detection network and a classification network. The algorithm was trained and tested on a retrospective dataset of 738 adolescent EOS cases using five-fold cross-validation and was subsequently tested on a clinical validation set of 259 adolescent EOS cases. On the clinical validation set, the algorithm achieved an accuracy of 0.942, a macro precision of 0.933, a macro recall of 0.938, and a macro F1-score of 0.935. It distinguished males from females almost perfectly, with the main classification errors occurring in females aged 12 to 14 years. For females specifically, the algorithm estimated menarche status with an accuracy of 0.910, a sensitivity of 0.943, and a specificity of 0.855, with an area under the curve of 0.959. Its kappa value against the actual menarche status was 0.806, indicating strong agreement. This method can efficiently analyze EOS radiographs to identify the menarche status of adolescents and is expected to become a routine clinical tool that informs doctors' decisions under specific clinical conditions.
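The abstract above reports macro-averaged precision, recall, F1, and Cohen's kappa. For readers unfamiliar with these metrics, here is a minimal sketch of how they are conventionally computed (toy labels for illustration only, not the study's data):

```python
def macro_scores(y_true, y_pred, labels):
    """Macro-averaged precision, recall, and F1 over the given label set."""
    precisions, recalls, f1s = [], [], []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec); recalls.append(rec); f1s.append(f1)
    n = len(labels)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n

def cohens_kappa(y_true, y_pred, labels):
    """Agreement beyond chance between predictions and ground truth."""
    n = len(y_true)
    po = sum(t == p for t, p in zip(y_true, y_pred)) / n  # observed agreement
    pe = sum(  # chance agreement from the marginal label frequencies
        (sum(t == c for t in y_true) / n) * (sum(p == c for p in y_pred) / n)
        for c in labels
    )
    return (po - pe) / (1 - pe)

# Toy example: 0 = pre-menarche, 1 = post-menarche (illustrative only).
y_true = [0, 0, 1, 1]
y_pred = [0, 1, 1, 1]
p, r, f = macro_scores(y_true, y_pred, labels=[0, 1])
k = cohens_kappa(y_true, y_pred, labels=[0, 1])
```

A kappa of 0.806, as reported, falls in the range conventionally read as strong agreement.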
Affiliation(s)
- Linzhen Xie: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Tenghui Ge: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Bin Xiao: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Xiaoguang Han: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Qi Zhang: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Zhongning Xu: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Da He: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
- Wei Tian: Department of Spine Surgery, Peking University Fourth School of Clinical Medicine, Beijing 100035, China; Department of Spine Surgery, Beijing Jishuitan Hospital, Beijing 100035, China; Research Unit of Intelligent Orthopedics, Chinese Academy of Medical Sciences, Beijing 100035, China
4
Pearce FJ, Cruz Rivera S, Liu X, Manna E, Denniston AK, Calvert MJ. The role of patient-reported outcome measures in trials of artificial intelligence health technologies: a systematic evaluation of ClinicalTrials.gov records (1997-2022). Lancet Digit Health 2023; 5:e160-e167. [PMID: 36828608] [DOI: 10.1016/s2589-7500(22)00249-7]
Abstract
The extent to which patient-reported outcome measures (PROMs) are used in clinical trials for artificial intelligence (AI) technologies is unknown. In this systematic evaluation, we aim to establish how PROMs are being used to assess AI health technologies. We searched ClinicalTrials.gov for interventional trials registered from inception to Sept 20, 2022, and included trials that tested an AI health technology. We excluded observational studies, patient registries, and expanded access reports. We extracted data regarding the form, function, and intended use population of the AI health technology, in addition to the PROMs used and whether PROMs were incorporated as an input or output in the AI model. The search identified 2958 trials, of which 627 were included in the analysis. 152 (24%) of the included trials used one or more PROM, visual analogue scale, patient-reported experience measure, or usability measure as a trial endpoint. The type of AI health technologies used by these trials included AI-enabled smart devices, clinical decision support systems, and chatbots. The number of clinical trials of AI health technologies registered on ClinicalTrials.gov and the proportion of trials that used PROMs increased from registry inception to 2022. The most common clinical areas AI health technologies were designed for were digestive system health for non-PROM trials and musculoskeletal health (followed by mental and behavioural health) for PROM trials, with PROMs commonly used in clinical areas for which assessment of health-related quality of life and symptom burden is particularly important. Additionally, AI-enabled smart devices were the most common applications tested in trials that used at least one PROM. 24 trials tested AI models that captured PROM data as an input for the AI model. PROM use in clinical trials of AI health technologies falls behind PROM use in all clinical trials. 
A limitation of this systematic evaluation was that trial records often lacked detail regarding the PROMs used or the type of AI health technology tested, which might have contributed to inaccuracies in the synthesised data. Overall, the use of PROMs in the function and assessment of AI health technologies is not only possible but a powerful way of showing that, even in the most technologically advanced health-care systems, patients' perspectives remain central.
Affiliation(s)
- Samantha Cruz Rivera: Centre for Patient Reported Outcomes Research, Institute of Applied Health Research, University of Birmingham, Birmingham, UK; Birmingham Health Partners Centre for Regulatory Science and Innovation, University of Birmingham, Birmingham, UK; Data-Enabled Medical Technologies and Devices Hub, University of Birmingham, Birmingham, UK
- Xiaoxuan Liu: Academic Unit of Ophthalmology, Institute of Inflammation and Ageing, University of Birmingham, Birmingham, UK; University Hospitals Birmingham NHS Foundation Trust, Birmingham, UK
- Elaine Manna: Centre for Patient Reported Outcomes Research, Institute of Applied Health Research, University of Birmingham, Birmingham, UK
- Alastair K Denniston: Centre for Patient Reported Outcomes Research, Institute of Applied Health Research, University of Birmingham, Birmingham, UK; Birmingham Health Partners Centre for Regulatory Science and Innovation, University of Birmingham, Birmingham, UK; Data-Enabled Medical Technologies and Devices Hub, University of Birmingham, Birmingham, UK; Academic Unit of Ophthalmology, Institute of Inflammation and Ageing, University of Birmingham, Birmingham, UK; University Hospitals Birmingham NHS Foundation Trust, Birmingham, UK; Health Data Research UK, London, UK; National Institute for Health and Care Research Biomedical Research Centre for Ophthalmology, Moorfields Hospital London NHS Foundation Trust and Institute of Ophthalmology, University College London, London, UK
- Melanie J Calvert: Centre for Patient Reported Outcomes Research, Institute of Applied Health Research, University of Birmingham, Birmingham, UK; Birmingham Health Partners Centre for Regulatory Science and Innovation, University of Birmingham, Birmingham, UK; Data-Enabled Medical Technologies and Devices Hub, University of Birmingham, Birmingham, UK; National Institute for Health and Care Research Applied Research Collaboration West Midlands, University of Birmingham, Birmingham, UK; National Institute for Health and Care Research Birmingham Biomedical Research Centre, University Hospitals Birmingham NHS Foundation Trust, Birmingham, UK; National Institute for Health and Care Research Surgical Reconstruction and Microbiology Centre, University Hospitals Birmingham NHS Foundation Trust, Birmingham, UK; Health Data Research UK, London, UK; National Institute for Health and Care Research Biomedical Research Centre for Ophthalmology, Moorfields Hospital London NHS Foundation Trust and Institute of Ophthalmology, University College London, London, UK; National Institute for Health and Care Research Birmingham-Oxford Blood and Transplant Research Unit in Precision Transplant and Cellular Therapeutics, Birmingham, UK
5
González-Bueno Puyal J, Brandao P, Ahmad OF, Bhatia KK, Toth D, Kader R, Lovat L, Mountney P, Stoyanov D. Spatio-temporal classification for polyp diagnosis. Biomed Opt Express 2023; 14:593-607. [PMID: 36874484] [PMCID: PMC9979670] [DOI: 10.1364/boe.473446]
Abstract
Colonoscopy remains the gold standard investigation for colorectal cancer screening as it offers the opportunity to both detect and resect pre-cancerous polyps. Computer-aided polyp characterisation can determine which polyps need polypectomy, and recent deep learning-based approaches have shown promising results as clinical decision support tools. Yet polyp appearance during a procedure can vary, making automatic predictions unstable. In this paper, we investigate the use of spatio-temporal information to improve the performance of lesion classification as adenoma or non-adenoma. Two methods are implemented, both showing an increase in performance and robustness in extensive experiments on internal and openly available benchmark datasets.
Affiliation(s)
- Juana González-Bueno Puyal: Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London W1W 7TY, UK; Odin Vision, London W1W 7TY, UK
- Omer F. Ahmad: Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London W1W 7TY, UK
- Rawen Kader: Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London W1W 7TY, UK
- Laurence Lovat: Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London W1W 7TY, UK
- Danail Stoyanov: Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London, London W1W 7TY, UK
6
Krenzer A, Banck M, Makowski K, Hekalo A, Fitting D, Troya J, Sudarevic B, Zoller WG, Hann A, Puppe F. A Real-Time Polyp-Detection System with Clinical Application in Colonoscopy Using Deep Convolutional Neural Networks. J Imaging 2023; 9:26. [PMID: 36826945] [PMCID: PMC9967208] [DOI: 10.3390/jimaging9020026]
Abstract
Colorectal cancer (CRC) is a leading cause of cancer-related deaths worldwide. The best method to prevent CRC is colonoscopy, during which the gastroenterologist searches for polyps. However, there is a risk of polyps being missed. Automated detection of polyps can assist the gastroenterologist during a colonoscopy. There are already publications examining the problem of polyp detection in the literature. Nevertheless, most of these systems are used only in a research context and are not implemented for clinical application. Therefore, we introduce the first fully open-source automated polyp-detection system that scores best on current benchmark data and is implemented ready for clinical application. To create the polyp-detection system (ENDOMIND-Advanced), we combined our own data collected from different hospitals and practices in Germany with open-source datasets to create a dataset of over 500,000 annotated images. ENDOMIND-Advanced leverages a post-processing technique based on video detection to work in real time with a stream of images. It is integrated into a prototype ready for application in clinical interventions. We achieve better performance than the best system in the literature and score an F1-score of 90.24% on the open-source CVC-VideoClinicDB benchmark.
Affiliation(s)
- Adrian Krenzer: Department of Artificial Intelligence and Knowledge Systems, Julius-Maximilians University of Würzburg, Sanderring 2, 97070 Würzburg, Germany; Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany
- Michael Banck: Department of Artificial Intelligence and Knowledge Systems, Julius-Maximilians University of Würzburg, Sanderring 2, 97070 Würzburg, Germany; Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany
- Kevin Makowski: Department of Artificial Intelligence and Knowledge Systems, Julius-Maximilians University of Würzburg, Sanderring 2, 97070 Würzburg, Germany
- Amar Hekalo: Department of Artificial Intelligence and Knowledge Systems, Julius-Maximilians University of Würzburg, Sanderring 2, 97070 Würzburg, Germany
- Daniel Fitting: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany
- Joel Troya: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany
- Boban Sudarevic: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany; Department of Internal Medicine and Gastroenterology, Katharinenhospital, Kriegsbergstrasse 60, 70174 Stuttgart, Germany
- Wolfgang G Zoller: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany; Department of Internal Medicine and Gastroenterology, Katharinenhospital, Kriegsbergstrasse 60, 70174 Stuttgart, Germany
- Alexander Hann: Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine II, University Hospital Würzburg, Oberdürrbacher Straße 6, 97080 Würzburg, Germany
- Frank Puppe: Department of Artificial Intelligence and Knowledge Systems, Julius-Maximilians University of Würzburg, Sanderring 2, 97070 Würzburg, Germany
7
Galati JS, Duve RJ, O'Mara M, Gross SA. Artificial intelligence in gastroenterology: A narrative review. Artif Intell Gastroenterol 2022; 3:117-141. [DOI: 10.35712/aig.v3.i5.117]
Abstract
Artificial intelligence (AI) is a complex concept, broadly defined in medicine as the development of computer systems to perform tasks that require human intelligence. It has the capacity to revolutionize medicine by increasing efficiency, expediting data and image analysis and identifying patterns, trends and associations in large datasets. Within gastroenterology, recent research efforts have focused on using AI in esophagogastroduodenoscopy, wireless capsule endoscopy (WCE) and colonoscopy to assist in diagnosis, disease monitoring, lesion detection and therapeutic intervention. The main objective of this narrative review is to provide a comprehensive overview of the research being performed within gastroenterology on AI in esophagogastroduodenoscopy, WCE and colonoscopy.
Affiliation(s)
- Jonathan S Galati: Department of Medicine, NYU Langone Health, New York, NY 10016, United States
- Robert J Duve: Department of Internal Medicine, Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, Buffalo, NY 14203, United States
- Matthew O'Mara: Division of Gastroenterology, NYU Langone Health, New York, NY 10016, United States
- Seth A Gross: Division of Gastroenterology, NYU Langone Health, New York, NY 10016, United States
8
Alkabbany I, Ali AM, Mohamed M, Elshazly SM, Farag A. An AI-Based Colonic Polyp Classifier for Colorectal Cancer Screening Using Low-Dose Abdominal CT. Sensors (Basel) 2022; 22:9761. [PMID: 36560132] [PMCID: PMC9782078] [DOI: 10.3390/s22249761]
Abstract
Among the non-invasive colorectal cancer (CRC) screening approaches, Computed Tomography Colonography (CTC), also called Virtual Colonoscopy (VC), is much more accurate. This work proposes an AI-based polyp detection framework for VC. Two main steps are addressed: automatic segmentation to isolate the colon region from its background, and automatic polyp detection. Moreover, we evaluate the performance of the proposed framework on low-dose Computed Tomography (CT) scans. We build on our visualization approach, Fly-In (FI), which provides "filet"-like projections of the internal surface of the colon. The performance of the FI approach confirms its ability to help gastroenterologists, and it holds great promise for combating CRC. In this work, the 2D projections of FI are fused with the 3D colon representation to generate new synthetic images, which are used to train a RetinaNet model to detect polyps. The trained model has a 94% F1-score and 97% sensitivity. Furthermore, we study the effect of dose variation in CT scans on the performance of the FI approach in polyp visualization. A simulation platform is developed for CTC visualization using FI, for both regular and low-dose CTC. This is accomplished using a novel AI restoration algorithm that enhances the low-dose CT images so that a 3D colon can be successfully reconstructed and visualized with the FI approach. Three senior board-certified radiologists evaluated the framework: at a peak voltage of 30 kV the average relative sensitivity of the platform was 92%, whereas a peak voltage of 60 kV produced an average relative sensitivity of 99.5%.
Affiliation(s)
- Islam Alkabbany: Computer Vision and Image Processing Laboratory, University of Louisville, Louisville, KY 40292, USA
- Asem M. Ali: Computer Vision and Image Processing Laboratory, University of Louisville, Louisville, KY 40292, USA
- Mostafa Mohamed: Computer Vision and Image Processing Laboratory, University of Louisville, Louisville, KY 40292, USA
- Aly Farag: Computer Vision and Image Processing Laboratory, University of Louisville, Louisville, KY 40292, USA
9
Fitting D, Krenzer A, Troya J, Banck M, Sudarevic B, Brand M, Böck W, Zoller WG, Rösch T, Puppe F, Meining A, Hann A. A video based benchmark data set (ENDOTEST) to evaluate computer-aided polyp detection systems. Scand J Gastroenterol 2022; 57:1397-1403. [PMID: 35701020] [DOI: 10.1080/00365521.2022.2085059]
Abstract
BACKGROUND AND AIMS: Computer-aided polyp detection (CADe) may become a standard for polyp detection during colonoscopy, and several systems are already commercially available. We report on a video-based benchmark technique for the first preclinical assessment of such systems before comparative randomized trials are undertaken. Additionally, we compare a commercially available CADe system with our newly developed one.
METHODS: ENDOTEST consisted of two datasets. The validation dataset contained 48 video snippets with 22,856 manually annotated images, of which 53.2% contained polyps. The performance dataset contained 10 full-length screening colonoscopies with 230,898 manually annotated images, of which 15.8% contained a polyp. Assessment parameters were accuracy for polyp detection and the time delay to first polyp detection after polyp appearance (FDT). Two CADe systems were assessed: a commercial CADe system (GI-Genius, Medtronic) and a self-developed new system (ENDOMIND). The latter is a convolutional neural network trained on 194,983 manually labeled images extracted from colonoscopy videos recorded mainly in six different gastroenterologic practices.
RESULTS: On the ENDOTEST, both CADe systems detected all polyps in at least one image. The per-frame sensitivity and specificity in full colonoscopies were 48.1% and 93.7%, respectively, for GI-Genius, and 54% and 92.7%, respectively, for ENDOMIND. The median FDT of ENDOMIND, at 217 ms (interquartile range [IQR] 8-1533), was significantly faster than that of GI-Genius, at 1050 ms (IQR 358-2767; p = 0.003).
CONCLUSIONS: Our benchmark ENDOTEST may be helpful for preclinical testing of new CADe devices. There appears to be a correlation between a shorter FDT and both a higher sensitivity and a lower specificity for polyp detection.
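The per-frame sensitivity/specificity and FDT metrics described above can be sketched as follows. This is a minimal illustration under assumptions, not the authors' evaluation code; the 40 ms frame time (roughly 25 fps) is an assumed value:

```python
def per_frame_rates(labels, preds):
    """Per-frame sensitivity and specificity for binary polyp presence.

    labels/preds: sequences of booleans, one entry per video frame.
    """
    tp = sum(l and p for l, p in zip(labels, preds))
    tn = sum((not l) and (not p) for l, p in zip(labels, preds))
    fp = sum((not l) and p for l, p in zip(labels, preds))
    fn = sum(l and (not p) for l, p in zip(labels, preds))
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity

def first_detection_delay_ms(labels, preds, frame_ms):
    """Delay from the first frame where the polyp is visible to the first
    frame where the system flags it; None if the polyp is never detected."""
    try:
        onset = labels.index(True)  # first polyp-positive frame
    except ValueError:
        return None
    for i in range(onset, len(preds)):
        if preds[i]:
            return (i - onset) * frame_ms
    return None

# Toy six-frame sequence: polyp appears at frame 2, system fires at frame 3.
labels = [False, False, True, True, True, True]
preds  = [False, False, False, True, True, False]
sens, spec = per_frame_rates(labels, preds)
fdt = first_detection_delay_ms(labels, preds, frame_ms=40)  # assumed 25 fps
```

Averaging FDT over many polyp appearances and summarizing it as a median with IQR gives figures of the kind reported in the abstract.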
Affiliation(s)
- Daniel Fitting: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany
- Adrian Krenzer: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany; Artificial Intelligence and Knowledge Systems, Institute for Computer Science, Julius-Maximilians-Universität, Würzburg, Germany
- Joel Troya: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany
- Michael Banck: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany; Artificial Intelligence and Knowledge Systems, Institute for Computer Science, Julius-Maximilians-Universität, Würzburg, Germany
- Boban Sudarevic: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany; Department of Internal Medicine and Gastroenterology, Katharinenhospital, Stuttgart, Germany
- Markus Brand: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany
- Wolfram G Zoller: Department of Internal Medicine and Gastroenterology, Katharinenhospital, Stuttgart, Germany
- Thomas Rösch: Department of Interdisciplinary Endoscopy, University Hospital Hamburg-Eppendorf, Hamburg, Germany
- Frank Puppe: Artificial Intelligence and Knowledge Systems, Institute for Computer Science, Julius-Maximilians-Universität, Würzburg, Germany
- Alexander Meining: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany
- Alexander Hann: Interventional and Experimental Endoscopy (InExEn), Internal Medicine II, University Hospital Wuerzburg, Würzburg, Germany
10
Wang A, Xiu X, Liu S, Qian Q, Wu S. Characteristics of Artificial Intelligence Clinical Trials in the Field of Healthcare: A Cross-Sectional Study on ClinicalTrials.gov. Int J Environ Res Public Health 2022; 19:13691. [PMID: 36294269] [PMCID: PMC9602501] [DOI: 10.3390/ijerph192013691]
Abstract
Artificial intelligence (AI) has driven innovative transformation in healthcare service patterns, despite a limited understanding of its performance in clinical practice. We conducted a cross-sectional analysis of AI-related trials in healthcare based on ClinicalTrials.gov, intending to investigate the trial characteristics and the development status of AI. Additionally, the Neo4j graph database and visualization technology were employed to construct an AI technology application graph, achieving a visual representation and analysis of research hotspots in healthcare AI. A total of 1725 eligible trials registered in ClinicalTrials.gov up to 31 March 2022 were included in this study. The number of trial registrations has grown dramatically each year since 2016. However, the AI-related trials had design drawbacks and poor-quality result reporting: the proportion of trials with prospective and randomized designs was insufficient, and most studies did not report results upon completion. Currently, most healthcare AI application studies are based on data-driven learning algorithms, covering various disease areas and healthcare scenarios. As few studies have publicly reported results on ClinicalTrials.gov, there is not enough evidence to support an assessment of AI's actual performance. The widespread implementation of AI technology in healthcare still faces many challenges and requires more high-quality prospective clinical validation.
Affiliation(s)
- Sizhu Wu
- Correspondence: ; Tel.: +86-10-5232-8760
11
Ahmad OF, González-Bueno Puyal J, Brandao P, Kader R, Abbasi F, Hussein M, Haidry RJ, Toth D, Mountney P, Seward E, Vega R, Stoyanov D, Lovat LB. Performance of artificial intelligence for detection of subtle and advanced colorectal neoplasia. Dig Endosc 2022; 34:862-869. [PMID: 34748665] [DOI: 10.1111/den.14187] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Received: 07/10/2021] [Revised: 10/22/2021] [Accepted: 11/05/2021] [Indexed: 02/05/2023]
Abstract
OBJECTIVES There is uncertainty regarding the efficacy of artificial intelligence (AI) software to detect subtle advanced neoplasia, particularly flat lesions and sessile serrated lesions (SSLs), due to their low prevalence in testing datasets and prospective trials. This has been highlighted as a top research priority for the field. METHODS An AI algorithm was evaluated on four video test datasets containing 173 polyps (35,114 polyp-positive frames and 634,988 polyp-negative frames) specifically enriched with flat lesions and SSLs, including a challenging dataset containing subtle advanced neoplasia. The challenging dataset was also evaluated by eight endoscopists (four JAG-independent, four trainees, according to the UK Joint Advisory Group on gastrointestinal endoscopy [JAG] standards). RESULTS In the first two video datasets, the algorithm achieved per-polyp sensitivities of 100% and 98.9%; per-frame sensitivities were 84.1% and 85.2%. In the subtle dataset, the algorithm detected significantly more polyps (P < 0.0001) than JAG-independent and trainee endoscopists, achieving per-polyp sensitivities of 79.5%, 37.2% and 11.5%, respectively. Furthermore, among subtle polyps detected by both the algorithm and at least one endoscopist, the AI detected polyps significantly faster on average. CONCLUSIONS The AI-based algorithm achieved high per-polyp sensitivities for advanced colorectal neoplasia, including flat lesions and SSLs, outperforming both JAG-independent and trainee endoscopists on a very challenging dataset containing subtle lesions that could easily be overlooked and contribute to interval colorectal cancer. Further prospective trials should evaluate AI for the detection of subtle advanced neoplasia in populations at higher risk of colorectal cancer.
Affiliation(s)
- Omer F Ahmad
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Juana González-Bueno Puyal
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Odin Vision Ltd, London, UK
- Patrick Brandao
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Odin Vision Ltd, London, UK
- Rawen Kader
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Faisal Abbasi
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Mohamed Hussein
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Rehan J Haidry
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
- Ed Seward
- Gastrointestinal Services, University College London Hospital, London, UK
- Roser Vega
- Gastrointestinal Services, University College London Hospital, London, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Laurence B Lovat
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences, University College London, London
- Division of Surgery and Interventional Sciences, University College London, London, UK
- Gastrointestinal Services, University College London Hospital, London, UK
12
Mori Y, Misawa M, Kudo S. Challenges in artificial intelligence for polyp detection. Dig Endosc 2022; 34:870-871. [PMID: 35318734] [PMCID: PMC9314935] [DOI: 10.1111/den.14279] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 01/11/2022] [Revised: 02/14/2022] [Accepted: 02/21/2022] [Indexed: 02/08/2023]
Affiliation(s)
- Yuichi Mori
- Clinical Effectiveness Research Group, Institute of Health and Society, University of Oslo, Oslo, Norway; Section for Gastroenterology, Department of Transplantation Medicine, Oslo University Hospital, Oslo, Norway; Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
- Masashi Misawa
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
- Shin-ei Kudo
- Digestive Disease Center, Showa University Northern Yokohama Hospital, Kanagawa, Japan
13
Li JW, Wang LM, Ang TL. Artificial intelligence-assisted colonoscopy: a narrative review of current data and clinical applications. Singapore Med J 2022; 63:118-124. [PMID: 35509251] [PMCID: PMC9251247] [DOI: 10.11622/smedj.2022044] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Indexed: 07/22/2023]
Abstract
Colonoscopy is the reference standard procedure for the prevention and diagnosis of colorectal cancer, a leading cause of cancer-related deaths in Singapore. Artificial intelligence systems are automated, objective and reproducible. Artificial intelligence-assisted colonoscopy has recently been introduced into clinical practice as a clinical decision support tool. This review summarizes the current published data and discusses ongoing research and current clinical applications of artificial intelligence-assisted colonoscopy.
Affiliation(s)
- James Weiquan Li
- Department of Gastroenterology and Hepatology, Changi General Hospital, Singapore
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- SingHealth Duke-NUS Medicine Academic Clinical Programme, Singapore
- Lai Mun Wang
- Pathology Section, Department of Laboratory Medicine, Changi General Hospital, Singapore
- SingHealth Duke-NUS Pathology Academic Clinical Programme, Singapore
- Tiing Leong Ang
- Department of Gastroenterology and Hepatology, Changi General Hospital, Singapore
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- SingHealth Duke-NUS Medicine Academic Clinical Programme, Singapore