Constant-time neural decoders for some BCH codes

Yuen Hsien Tseng, Ja Ling Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

High-order neural networks (HONNs) are shown to decode some BCH codes in constant time with very low hardware complexity. A HONN is a direct extension of the linear perceptron: it uses a polynomial consisting of a set of product terms as its discriminant function. Because a product term is isomorphic to a parity function, and a two-layer perceptron realizing the parity function was given by Rumelhart, Hinton, and Williams (1986), a HONN has a simple realization when viewed as a set of parity networks in the first half of the network, followed by a linear perceptron in the second half. The main problem in applying high-order neural networks to a specific task is choosing a proper set of product terms. We apply genetic algorithms to this structure-adaptation problem.
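
To make the decoder structure concrete, the sketch below is an illustrative assumption, not the authors' implementation: the code length, product-term subsets, and weights are hypothetical. It evaluates a high-order discriminant in which each product term multiplies a subset of the +/-1-mapped received bits (equivalent to a parity check on that subset), and a linear combination of the terms yields one bit decision.

    # Minimal sketch of a high-order (HONN) discriminant for one decoded bit.
    # Mapping bits {0,1} -> {+1,-1} makes each product term equal to the
    # parity of the selected bit subset, as described in the abstract.

    def honn_discriminant(received_bits, term_subsets, weights, bias=0.0):
        """received_bits: list of 0/1 ints; term_subsets: index tuples, one per product term."""
        x = [1 - 2 * b for b in received_bits]   # 0 -> +1, 1 -> -1
        s = bias
        for subset, w in zip(term_subsets, weights):
            term = 1
            for i in subset:
                term *= x[i]                     # product over the subset = its parity
            s += w * term
        return s                                 # sign(s) drives the bit decision

    # Hypothetical usage on a length-7 word with three illustrative terms.
    subsets = [(0, 2, 4, 6), (1, 2, 5, 6), (3, 4, 5, 6)]
    weights = [1.0, 1.0, 1.0]
    word = [1, 0, 1, 1, 0, 1, 0]
    decoded_bit = 1 if honn_discriminant(word, subsets, weights) < 0 else 0
    print(decoded_bit)

In the paper, genetic algorithms select which product terms to include (the role played by term_subsets above); here the subsets are fixed by hand purely for illustration.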

Original language: English
Title of host publication: Proceedings - 1994 IEEE International Symposium on Information Theory, ISIT 1994
Number of pages: 1
DOIs: 10.1109/ISIT.1994.394675
Publication status: Published - 1994 Dec 1
Event: 1994 IEEE International Symposium on Information Theory, ISIT 1994 - Trondheim, Norway
Duration: 1994 Jun 27 - 1994 Jul 1

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8095

Other

Other: 1994 IEEE International Symposium on Information Theory, ISIT 1994
Country: Norway
City: Trondheim
Period: 94/6/27 - 94/7/1

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modelling and Simulation
  • Applied Mathematics

Cite this

Tseng, Y. H., & Wu, J. L. (1994). Constant-time neural decoders for some BCH codes. In Proceedings - 1994 IEEE International Symposium on Information Theory, ISIT 1994 [394675] (IEEE International Symposium on Information Theory - Proceedings). https://doi.org/10.1109/ISIT.1994.394675

@inproceedings{feec15b134ab4771bfd83ea3adb2cc7e,
title = "Constant-time neural decoders for some BCH codes",
abstract = "High-order neural networks (HONN) are shown to decode some BCH codes in constant-time with very low hardware complexity. HONN is a direct extension of the linear perceptron: it uses a polynomial consisting of a set of product terms as its discriminant function. Because a product term is isomorphic to a parity function and a two-layer perceptron for the parity function has been shown by Rumelhart, Hinton, and Williams (1986), HONN has a simple realization if it is considered as having a set of parity networks in the first-half layer, followed by a linear perceptron in the second-half layer. The main problem in using high-order neural networks for a specific application is to decide a proper set of product terms. We apply genetic algorithms to this structure-adaptation problem.",
author = "Tseng, {Yuen Hsien} and Wu, {Ja Ling}",
year = "1994",
month = "12",
day = "1",
doi = "10.1109/ISIT.1994.394675",
language = "English",
isbn = "0780320158",
series = "IEEE International Symposium on Information Theory - Proceedings",
booktitle = "Proceedings - 1994 IEEE International Symposium on Information Theory, ISIT 1994",

}
