Detailed Information

Cited 13 times in Web of Science; cited 29 times in Scopus

A Novel Approach for Increased Convolutional Neural Network Performance in Gastric-Cancer Classification Using Endoscopic Images (Open Access)

Authors
Lee, Sin-Ae; Cho, Hyun Chin; Cho, Hyun-Chong
Issue Date
Mar-2021
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Keywords
Cancer; Lesions; Endoscopes; Image segmentation; Training; Image color analysis; Internet; Augmentation; computer-aided diagnosis (CADx); deep learning; gastric cancer; segmentation
Citation
IEEE ACCESS, v.9, pp. 51847-51854
Pages
8
Indexed
SCIE
SCOPUS
Journal Title
IEEE ACCESS
Volume
9
Start Page
51847
End Page
51854
URI
https://scholarworks.gnu.ac.kr/handle/sw.gnu/72906
DOI
10.1109/ACCESS.2021.3069747
ISSN
2169-3536
Abstract
Gastric cancer is the third-most-common cause of cancer-related death in the world. Fortunately, it can be detected using endoscopy equipment. Computer-aided diagnosis (CADx) systems can help clinicians distinguish cancer from other gastric diseases more accurately. In this paper, we present a CADx system that distinguishes and classifies gastric cancer from pre-cancerous conditions such as gastric polyps, gastric ulcers, gastritis, and bleeding. The system uses a deep-learning model, Xception, which is built on depth-wise separable convolutions, to classify images as cancerous or non-cancerous. The proposed method consists of two preprocessing steps: Google's AutoAugment for data augmentation, and the simple linear iterative clustering (SLIC) superpixel and fast and robust fuzzy C-means (FRFCM) algorithms for image segmentation. Together, these approaches yield a feasible method of distinguishing and classifying cancers from other gastric diseases. Using biopsy-confirmed ground truth, performance is measured on the test sets as the area under the receiver operating characteristic curve (Az). The Az of the proposed classification model is 0.96, an improvement of 0.06 over the Az of 0.90 obtained with the original data. Our method is fully automated, requiring no manual specification of regions of interest at test time, and the images used for model training are selected at random. This methodology may play a crucial role in selecting effective treatment options without the need for a surgical biopsy.
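The abstract outlines a two-stage pipeline: superpixel-based segmentation during preprocessing, followed by an Xception classifier evaluated with the area under the ROC curve (Az). The Python sketch below illustrates such a pipeline under stated assumptions: it uses scikit-image's SLIC and Keras's Xception, omits the FRFCM refinement and AutoAugment steps described in the paper, and the function names (superpixel_preprocess, build_classifier) and hyperparameters are illustrative rather than taken from the publication.

# Minimal sketch of a SLIC-preprocessed, Xception-based binary classifier.
# FRFCM clustering and AutoAugment from the paper are not reproduced here.
from skimage.segmentation import slic
from skimage.color import label2rgb
from sklearn.metrics import roc_auc_score
import tensorflow as tf
from tensorflow.keras.applications import Xception

def superpixel_preprocess(image, n_segments=200):
    """Replace each SLIC superpixel with its mean colour (illustrative
    stand-in for the paper's SLIC + FRFCM segmentation step)."""
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=1)
    return label2rgb(segments, image, kind="avg")

def build_classifier(input_shape=(299, 299, 3)):
    """Cancer vs. non-cancer classifier built on Xception
    (depth-wise separable convolutions), ImageNet-initialised."""
    base = Xception(include_top=False, weights="imagenet",
                    input_shape=input_shape, pooling="avg")
    out = tf.keras.layers.Dense(1, activation="sigmoid")(base.output)
    model = tf.keras.Model(base.input, out)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC(name="Az")])
    return model

# Az is the area under the ROC curve on a held-out test set, e.g.:
# y_true, y_score = ..., ...   # ground-truth labels and predicted probabilities
# print("Az =", roc_auc_score(y_true, y_score))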
Files in This Item
There are no files associated with this item.
Appears in Collections
ETC > Journal Articles