An optimal dimension expansion procedure for obtaining linearly separable subsets

Yuen-Hsien Tseng, Ja Ling Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

The authors study necessary and sufficient conditions for subsets to be linearly separable and then propose an optimal dimension expansion procedure that makes any mapping performed by perceptrons learnable by an error-correction procedure. For the n-bit parity check problem, it is shown that augmenting the input with only one additional dimension makes it solvable by a single-layer perceptron. Other applications, such as decoding error-correcting codes, are also considered.
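The idea can be sketched on the smallest parity problem, 2-bit parity (XOR), which is not linearly separable in its original input space. The augmenting feature x1*x2 used below is an illustrative choice, not necessarily the construction in the paper; with it, the expanded samples become linearly separable, so the classic perceptron error-correction rule converges:

```python
# Dimension-expansion sketch for 2-bit parity (XOR).
# Assumption: we augment with the product feature x1*x2 (one extra
# dimension), which is one simple way to make XOR linearly separable.

def expand(x):
    """Map (x1, x2) to (x1, x2, x1*x2, 1); the last entry acts as a bias."""
    x1, x2 = x
    return [x1, x2, x1 * x2, 1]

def train_perceptron(samples, epochs=200):
    """Perceptron error-correction rule on the expanded inputs."""
    w = [0.0] * 4
    for _ in range(epochs):
        for x, target in samples:
            z = expand(x)
            out = 1 if sum(wi * zi for wi, zi in zip(w, z)) > 0 else 0
            err = target - out  # -1, 0, or +1
            w = [wi + err * zi for wi, zi in zip(w, z)]
    return w

# XOR truth table: output is the parity of the two input bits.
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w = train_perceptron(xor)
preds = [1 if sum(wi * zi for wi, zi in zip(w, expand(x))) > 0 else 0
         for x, _ in xor]
```

Because a separating weight vector such as (1, 1, -2, -0.5) exists in the expanded space, the perceptron convergence theorem guarantees the error-correction rule finds a correct classifier in finitely many updates.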

Original language: English
Title of host publication: 91 IEEE Int Jt Conf Neural Networks IJCNN 91
Publisher: IEEE
Pages: 2461-2465
Number of pages: 5
ISBN (Print): 0780302273
Publication status: Published - 1991 Dec 1
Event: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91 - Singapore, Singapore
Duration: 1991 Nov 18 – 1991 Nov 21

Publication series

Name: 91 IEEE Int Jt Conf Neural Networks IJCNN 91

Other

Other: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
City: Singapore, Singapore
Period: 91/11/18 – 91/11/21

ASJC Scopus subject areas

  • Engineering (all)

