Minimal structure of self-organizing HCMAC neural network classifier

Chih Ming Chen, Yung Feng Lu, Chin Ming Hong

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

The authors previously proposed a self-organizing Hierarchical Cerebellar Model Articulation Controller (HCMAC) neural network containing a hierarchical GCMAC neural network and a self-organizing input space module to solve high-dimensional pattern classification problems. This novel neural network exhibits fast learning, a low memory requirement, automatic memory parameter determination, and highly accurate high-dimensional pattern classification. However, the original architecture needs to be hierarchically expanded using a full binary tree topology according to the dimension of the input vectors. This approach creates many redundant GCMAC nodes when the dimension of the input vectors in the pattern classification problem does not exactly match that in the self-organizing HCMAC neural network. These redundant GCMAC nodes waste memory units and degrade the learning performance of the self-organizing HCMAC neural network. Therefore, this study presents a minimal structure of self-organizing HCMAC (MHCMAC) neural network with the same dimension of input vectors as the pattern classification problem. Additionally, this study compares the learning performance of this novel learning structure with those of the BP neural network, support vector machine (SVM), and original self-organizing HCMAC neural network on ten benchmark pattern classification data sets from the UCI machine learning repository. In particular, the experimental results reveal that the self-organizing MHCMAC neural network handles high-dimensional pattern classification problems better than the BP, SVM, or original self-organizing HCMAC neural network. Moreover, the proposed self-organizing MHCMAC neural network significantly reduces the memory requirement of the original self-organizing HCMAC neural network, and achieves a higher training speed and higher pattern classification accuracy than the original self-organizing HCMAC neural network on most of the benchmark test data sets. The experimental results also show that the MHCMAC neural network learns continuous functions well and is suitable for Web page classification.
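The redundancy the abstract describes can be made concrete with a small sketch. This is not taken from the paper itself: it assumes each GCMAC node fuses two inputs and that the full-binary-tree variant pads the input dimension up to the next power of two, so that every input has a slot in a complete tree; `full_tree_gcmac_nodes` and `minimal_tree_gcmac_nodes` are hypothetical helper names used only for illustration.

```python
import math


def full_tree_gcmac_nodes(input_dim: int) -> int:
    """Node count under the assumed full-binary-tree expansion:
    the input dimension is padded up to the next power of two,
    each leaf GCMAC fuses two inputs, and a full binary tree
    with L leaves contains 2L - 1 nodes in total."""
    padded = 1 << math.ceil(math.log2(max(input_dim, 2)))
    leaves = padded // 2
    return 2 * leaves - 1


def minimal_tree_gcmac_nodes(input_dim: int) -> int:
    """Node count under the assumed minimal structure: exactly
    ceil(d / 2) leaf GCMACs, merged pairwise with no padding,
    so no redundant nodes are created."""
    leaves = math.ceil(input_dim / 2)
    return 2 * leaves - 1


# With 10-dimensional inputs the padded full tree carries
# redundant nodes that the minimal structure avoids; when the
# dimension is already a power of two, the two coincide.
print(full_tree_gcmac_nodes(10), minimal_tree_gcmac_nodes(10))
print(full_tree_gcmac_nodes(16), minimal_tree_gcmac_nodes(16))
```

Under these assumptions, a 10-dimensional problem forces the full tree to allocate nodes for 16 padded inputs, and the gap between the two counts is exactly the memory the abstract calls redundant.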

Original language: English
Pages (from-to): 201-228
Number of pages: 28
Journal: Neural Processing Letters
Volume: 23
Issue number: 2
DOI: 10.1007/s11063-006-6277-0
Publication status: Published - 2006 Apr 1

Keywords

  • Cerebellar Model Articulation Controller (CMAC)
  • Minimal structure of self-organizing HCMAC (MHCMAC) neural network
  • Self-organizing hierarchical CMAC (HCMAC) neural network

ASJC Scopus subject areas

  • Software
  • Neuroscience (all)
  • Computer Networks and Communications
  • Artificial Intelligence

Cite this

Minimal structure of self-organizing HCMAC neural network classifier. / Chen, Chih Ming; Lu, Yung Feng; Hong, Chin Ming.

In: Neural Processing Letters, Vol. 23, No. 2, 01.04.2006, p. 201-228.

@article{ecec5372468147e7a97446b5e4f9d389,
title = "Minimal structure of self-organizing HCMAC neural network classifier",
abstract = "The authors previously proposed a self-organizing Hierarchical Cerebellar Model Articulation Controller (HCMAC) neural network containing a hierarchical GCMAC neural network and a self-organizing input space module to solve high-dimensional pattern classification problems. This novel neural network exhibits fast learning, a low memory requirement, automatic memory parameter determination, and highly accurate high-dimensional pattern classification. However, the original architecture needs to be hierarchically expanded using a full binary tree topology according to the dimension of the input vectors. This approach creates many redundant GCMAC nodes when the dimension of the input vectors in the pattern classification problem does not exactly match that in the self-organizing HCMAC neural network. These redundant GCMAC nodes waste memory units and degrade the learning performance of the self-organizing HCMAC neural network. Therefore, this study presents a minimal structure of self-organizing HCMAC (MHCMAC) neural network with the same dimension of input vectors as the pattern classification problem. Additionally, this study compares the learning performance of this novel learning structure with those of the BP neural network, support vector machine (SVM), and original self-organizing HCMAC neural network on ten benchmark pattern classification data sets from the UCI machine learning repository. In particular, the experimental results reveal that the self-organizing MHCMAC neural network handles high-dimensional pattern classification problems better than the BP, SVM, or original self-organizing HCMAC neural network. Moreover, the proposed self-organizing MHCMAC neural network significantly reduces the memory requirement of the original self-organizing HCMAC neural network, and achieves a higher training speed and higher pattern classification accuracy than the original self-organizing HCMAC neural network on most of the benchmark test data sets. The experimental results also show that the MHCMAC neural network learns continuous functions well and is suitable for Web page classification.",
keywords = "Cerebellar Model Articulation Controller (CMAC), Minimal structure of self-organizing HCMAC (MHCMAC) neural network, Self-organizing hierarchical CMAC (HCMAC) neural network",
author = "Chen, {Chih Ming} and Lu, {Yung Feng} and Hong, {Chin Ming}",
year = "2006",
month = "4",
day = "1",
doi = "10.1007/s11063-006-6277-0",
language = "English",
volume = "23",
pages = "201--228",
journal = "Neural Processing Letters",
issn = "1370-4621",
publisher = "Springer Netherlands",
number = "2",

}

TY - JOUR

T1 - Minimal structure of self-organizing HCMAC neural network classifier

AU - Chen, Chih Ming

AU - Lu, Yung Feng

AU - Hong, Chin Ming

PY - 2006/4/1

Y1 - 2006/4/1

N2 - The authors previously proposed a self-organizing Hierarchical Cerebellar Model Articulation Controller (HCMAC) neural network containing a hierarchical GCMAC neural network and a self-organizing input space module to solve high-dimensional pattern classification problems. This novel neural network exhibits fast learning, a low memory requirement, automatic memory parameter determination, and highly accurate high-dimensional pattern classification. However, the original architecture needs to be hierarchically expanded using a full binary tree topology according to the dimension of the input vectors. This approach creates many redundant GCMAC nodes when the dimension of the input vectors in the pattern classification problem does not exactly match that in the self-organizing HCMAC neural network. These redundant GCMAC nodes waste memory units and degrade the learning performance of the self-organizing HCMAC neural network. Therefore, this study presents a minimal structure of self-organizing HCMAC (MHCMAC) neural network with the same dimension of input vectors as the pattern classification problem. Additionally, this study compares the learning performance of this novel learning structure with those of the BP neural network, support vector machine (SVM), and original self-organizing HCMAC neural network on ten benchmark pattern classification data sets from the UCI machine learning repository. In particular, the experimental results reveal that the self-organizing MHCMAC neural network handles high-dimensional pattern classification problems better than the BP, SVM, or original self-organizing HCMAC neural network. Moreover, the proposed self-organizing MHCMAC neural network significantly reduces the memory requirement of the original self-organizing HCMAC neural network, and achieves a higher training speed and higher pattern classification accuracy than the original self-organizing HCMAC neural network on most of the benchmark test data sets. The experimental results also show that the MHCMAC neural network learns continuous functions well and is suitable for Web page classification.

KW - Cerebellar Model Articulation Controller (CMAC)

KW - Minimal structure of self-organizing HCMAC (MHCMAC) neural network

KW - Self-organizing hierarchical CMAC (HCMAC) neural network

UR - http://www.scopus.com/inward/record.url?scp=33645504803&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33645504803&partnerID=8YFLogxK

U2 - 10.1007/s11063-006-6277-0

DO - 10.1007/s11063-006-6277-0

M3 - Article

AN - SCOPUS:33645504803

VL - 23

SP - 201

EP - 228

JO - Neural Processing Letters

JF - Neural Processing Letters

SN - 1370-4621

IS - 2

ER -