A pruning structure of self-organizing HCMAC neural network classifier

Chih Ming Chen, Chin Ming Hong, Yung Feng Lu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In our previous work, a self-organizing HCMAC neural network was proposed to solve high-dimensional pattern classification problems. However, a large number of redundant GCMAC nodes may be constructed due to its full-binary-tree expansion approach. This study therefore presents a pruning structure for the self-organizing HCMAC neural network to address this problem. Experimental results show that the proposed pruning structure not only greatly reduces memory requirements but also maintains fast training speed and achieves a higher pattern classification accuracy rate than the original self-organizing HCMAC neural network on most of the tested benchmark data sets.
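The pruning idea summarized in the abstract can be pictured as a bottom-up removal of low-contribution subtrees from the full binary tree of GCMAC nodes. The sketch below is an illustrative assumption: the per-node `significance` score and fixed threshold are hypothetical stand-ins, not the paper's actual pruning criterion.

```python
class Node:
    """A node of a full binary tree of (hypothetical) GCMAC units."""

    def __init__(self, significance, left=None, right=None):
        self.significance = significance  # assumed per-node contribution score
        self.left = left
        self.right = right


def prune(node, threshold):
    """Recursively drop subtrees whose nodes all fall below the threshold.

    Children are pruned first, so an internal node whose entire subtree
    is insignificant collapses to a leaf and is then removed itself.
    """
    if node is None:
        return None
    node.left = prune(node.left, threshold)
    node.right = prune(node.right, threshold)
    # Remove a leaf whose contribution is negligible.
    if node.left is None and node.right is None and node.significance < threshold:
        return None
    return node


def count(node):
    """Number of nodes remaining in the tree."""
    return 0 if node is None else 1 + count(node.left) + count(node.right)
```

Under this toy criterion, pruning a five-node tree whose left subtree is entirely insignificant removes that subtree wholesale, which is the memory saving the abstract refers to.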

Original language: English
Title of host publication: 2004 IEEE International Joint Conference on Neural Networks - Proceedings
Pages: 861-866
Number of pages: 6
DOI: 10.1109/IJCNN.2004.1380042
ISBN (Print): 0780383591
Publication status: Published - 2004 Dec 1
Event: 2004 IEEE International Joint Conference on Neural Networks - Budapest, Hungary
Duration: 2004 Jul 25 - 2004 Jul 29

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
Volume: 2
ISSN (Print): 1098-7576



ASJC Scopus subject areas

  • Software

Cite this

Chen, C. M., Hong, C. M., & Lu, Y. F. (2004). A pruning structure of self-organizing HCMAC neural network classifier. In 2004 IEEE International Joint Conference on Neural Networks - Proceedings (pp. 861-866). (IEEE International Conference on Neural Networks - Conference Proceedings; Vol. 2). https://doi.org/10.1109/IJCNN.2004.1380042

