Automatic anatomical classification of esophagogastroduodenoscopy images using deep convolutional neural networks

Overview of attention for article published in Scientific Reports, May 2018

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

2 X users

Citations

117 Dimensions

Readers on

104 Mendeley
Title
Automatic anatomical classification of esophagogastroduodenoscopy images using deep convolutional neural networks
Published in
Scientific Reports, May 2018
DOI 10.1038/s41598-018-25842-6
Pubmed ID
Authors

Hirotoshi Takiyama, Tsuyoshi Ozawa, Soichiro Ishihara, Mitsuhiro Fujishiro, Satoki Shichijo, Shuhei Nomura, Motoi Miura, Tomohiro Tada

Abstract

The use of convolutional neural networks (CNNs) has dramatically advanced our ability to recognize images with machine learning methods. We aimed to construct a CNN that could correctly recognize the anatomical location of esophagogastroduodenoscopy (EGD) images. A CNN-based diagnostic program was built on the GoogLeNet architecture and trained with 27,335 EGD images categorized into four major anatomical locations (larynx, esophagus, stomach and duodenum) and three sub-classifications for stomach images (upper, middle, and lower regions). The performance of the CNN was evaluated on an independent validation set of 17,081 EGD images by drawing receiver operating characteristic (ROC) curves and calculating the areas under the curve (AUCs). The ROC curves showed that the trained CNN classified the anatomical location of EGD images with high performance, with AUCs of 1.00 for larynx and esophagus images and 0.99 for stomach and duodenum images. Furthermore, the trained CNN could recognize specific anatomical locations within the stomach, with AUCs of 0.99 for the upper, middle, and lower stomach. In conclusion, the trained CNN showed robust performance in recognizing the anatomical location of EGD images, highlighting its significant potential for future application in a computer-aided EGD diagnostic system.
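
The evaluation described in the abstract lends itself to a short illustration. The sketch below assumes a scikit-learn workflow and an illustrative six-class label set (the four major locations, with the stomach split into its three regions); it shows how one-vs-rest ROC curves and per-class AUCs of the kind reported above can be computed from a classifier's softmax scores. It is not the authors' code.

```python
# Minimal sketch of per-class ROC/AUC evaluation for an anatomical-location
# classifier. Class names, the scikit-learn workflow, and the placeholder
# predictions are illustrative assumptions, not the published implementation.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Assumed label set: four major locations, stomach split into three regions.
CLASSES = ["larynx", "esophagus", "upper_stomach",
           "middle_stomach", "lower_stomach", "duodenum"]

def per_class_auc(y_true, y_score):
    """y_true: (n,) integer labels; y_score: (n, n_classes) softmax scores."""
    results = {}
    for idx, name in enumerate(CLASSES):
        binary_truth = (y_true == idx).astype(int)       # one-vs-rest ground truth
        fpr, tpr, _ = roc_curve(binary_truth, y_score[:, idx])
        results[name] = {
            "auc": roc_auc_score(binary_truth, y_score[:, idx]),
            "roc": (fpr, tpr),                           # points for plotting the curve
        }
    return results

# Example with random placeholder predictions standing in for CNN outputs.
rng = np.random.default_rng(0)
y_true = rng.integers(0, len(CLASSES), size=1000)
y_score = rng.dirichlet(np.ones(len(CLASSES)), size=1000)
for name, r in per_class_auc(y_true, y_score).items():
    print(f"{name}: AUC = {r['auc']:.2f}")
```

In practice the placeholder predictions would be replaced by the trained GoogLeNet-style CNN's softmax outputs on the 17,081 validation images.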

X Demographics

The data shown below were collected from the profiles of the 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 104 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 104 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Researcher | 16 | 15%
Student > Ph. D. Student | 15 | 14%
Student > Bachelor | 14 | 13%
Student > Master | 11 | 11%
Student > Postgraduate | 5 | 5%
Other | 15 | 14%
Unknown | 28 | 27%
Readers by discipline | Count | As %
Computer Science | 24 | 23%
Medicine and Dentistry | 21 | 20%
Engineering | 11 | 11%
Agricultural and Biological Sciences | 5 | 5%
Nursing and Health Professions | 3 | 3%
Other | 6 | 6%
Unknown | 34 | 33%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 15 May 2018.
All research outputs: #14,985,360 of 23,052,509 outputs
Outputs from Scientific Reports: #73,149 of 124,556 outputs
Outputs of similar age: #197,151 of 326,851 outputs
Outputs of similar age from Scientific Reports: #1,997 of 3,313 outputs
Altmetric has tracked 23,052,509 research outputs across all sources so far. This one is in the 32nd percentile – i.e., 32% of other outputs scored the same or lower than it.
So far Altmetric has tracked 124,556 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 18.2. This one is in the 37th percentile – i.e., 37% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 326,851 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 36th percentile – i.e., 36% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 3,313 others from the same source and published within six weeks on either side of this one. This one is in the 36th percentile – i.e., 36% of its contemporaries scored the same or lower than it.
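
The percentile wording used throughout this section follows one rule: the percent of tracked outputs that scored the same as or lower than this one. A minimal sketch of that arithmetic is below; Altmetric's exact tie handling and snapshot timing are not documented on this page, so the helper function and the sample scores are purely illustrative.

```python
# Percent of outputs whose Attention Score is at or below a given score,
# assuming "percentile" means "scored the same or lower". Illustrative only.
def attention_percentile(score, all_scores):
    at_or_below = sum(1 for s in all_scores if s <= score)
    return 100.0 * at_or_below / len(all_scores)

# Example with made-up scores: a score of 2 among mostly lower-scoring outputs.
scores = [0, 0, 1, 1, 1, 2, 2, 3, 5, 18]
print(f"{attention_percentile(2, scores):.0f}th percentile")  # -> 70th percentile
```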