Detecting masses in digital mammograms based on texture analysis and neural classifier

Guo Shiang Lin*, Yu Cheng Chang, Wei Cheng Yeh, Kai Che Liu, Chia Hung Yeh

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we propose a mass detection method based on texture analysis and a neural classifier. The proposed method consists of three parts: ROI selection, feature extraction, and neural classification. ROI selection reduces the computational complexity of the scheme. For feature extraction, intensity and texture information from the spatial and wavelet domains is used to locate candidate mass regions. The extracted texture features are then fed to a supervised neural network that serves as the classifier. Experimental results show that the average recall rate of the proposed scheme exceeds 93%, demonstrating that the method can effectively detect masses in digital mammograms.
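The abstract describes combining spatial intensity statistics with wavelet-domain texture energies as features for each ROI. The paper does not specify the wavelet basis or feature set, so the sketch below is a hypothetical illustration of that idea: a one-level 2D Haar decomposition (a common, simple choice) whose detail-subband energies, together with spatial mean and standard deviation, form a feature vector that could be passed to a supervised neural classifier.

```python
import numpy as np

def haar_level1(img):
    """One-level 2D Haar decomposition into LL, LH, HL, HH subbands.

    Works on arrays with even height and width; each subband has half
    the spatial resolution of the input.
    """
    a = img[0::2, 0::2].astype(float)  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2].astype(float)  # top-right
    c = img[1::2, 0::2].astype(float)  # bottom-left
    d = img[1::2, 1::2].astype(float)  # bottom-right
    ll = (a + b + c + d) / 4.0  # approximation (low-low)
    lh = (a + b - c - d) / 4.0  # horizontal detail
    hl = (a - b + c - d) / 4.0  # vertical detail
    hh = (a - b - c + d) / 4.0  # diagonal detail
    return ll, lh, hl, hh

def texture_features(roi):
    """Spatial intensity statistics plus wavelet subband energies for one ROI.

    Returns a 5-vector: [mean, std, LH energy, HL energy, HH energy].
    The feature set is illustrative, not the one used in the paper.
    """
    ll, lh, hl, hh = haar_level1(roi)
    feats = [roi.mean(), roi.std()]                   # spatial-domain intensity
    feats += [np.mean(s ** 2) for s in (lh, hl, hh)]  # wavelet-domain texture
    return np.array(feats)
```

Feature vectors computed this way for a set of labeled ROIs would then be used to train any supervised neural network (e.g. a small multilayer perceptron) as the mass/non-mass classifier.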

Original language: English
Title of host publication: Proceedings - 3rd International Conference on Information Security and Intelligent Control, ISIC 2012
Pages: 222-225
Number of pages: 4
DOIs
Publication status: Published - 2012
Externally published: Yes
Event: 3rd International Conference on Information Security and Intelligent Control, ISIC 2012 - Yunlin, Taiwan
Duration: 2012 Aug 14 - 2012 Aug 16

Publication series

Name: Proceedings - 3rd International Conference on Information Security and Intelligent Control, ISIC 2012

Conference

Conference: 3rd International Conference on Information Security and Intelligent Control, ISIC 2012
Country/Territory: Taiwan
City: Yunlin
Period: 2012/08/14 - 2012/08/16

Keywords

  • mass detection
  • neural classifier
  • texture analysis

ASJC Scopus subject areas

  • Artificial Intelligence
  • Information Systems
