TY - GEN
T1 - Neural networks with dynamic structure using a GA-based learning method
AU - Fall, Everett
AU - Chiang, Hsin Han
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/6/1
Y1 - 2015/6/1
N2 - Artificial neural networks (NNs) are traditionally designed with distinctly defined layers (input layer, hidden layers, output layer), and network design techniques and training algorithms are accordingly based on this concept of strictly defined layers. In this paper, a new approach to designing neural networks is presented. The structure of the proposed NN is not strictly defined (each neuron may receive input from any other neuron). Instead, the initial network structure can be randomly generated, and traditional training methods, such as back-propagation, are replaced or augmented by a genetic algorithm (GA). The weighting of each neuron input is genetically encoded to serve as the genes for the GA. Using the training data provided to the supervised network, each neuron's contribution to producing the desired output serves as the selection function. Each neuron is then modified to store and recall past weightings for possible future use. As a proof of concept, a simple network is trained to recognize vertical and horizontal lines.
AB - Artificial neural networks (NNs) are traditionally designed with distinctly defined layers (input layer, hidden layers, output layer), and network design techniques and training algorithms are accordingly based on this concept of strictly defined layers. In this paper, a new approach to designing neural networks is presented. The structure of the proposed NN is not strictly defined (each neuron may receive input from any other neuron). Instead, the initial network structure can be randomly generated, and traditional training methods, such as back-propagation, are replaced or augmented by a genetic algorithm (GA). The weighting of each neuron input is genetically encoded to serve as the genes for the GA. Using the training data provided to the supervised network, each neuron's contribution to producing the desired output serves as the selection function. Each neuron is then modified to store and recall past weightings for possible future use. As a proof of concept, a simple network is trained to recognize vertical and horizontal lines.
KW - Neural network
KW - biologically inspired
KW - complex neuron
KW - deep learning
KW - genetic algorithm
KW - weight retention
UR - http://www.scopus.com/inward/record.url?scp=84941215835&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84941215835&partnerID=8YFLogxK
U2 - 10.1109/ICNSC.2015.7116001
DO - 10.1109/ICNSC.2015.7116001
M3 - Conference contribution
AN - SCOPUS:84941215835
T3 - ICNSC 2015 - 2015 IEEE 12th International Conference on Networking, Sensing and Control
SP - 7
EP - 12
BT - ICNSC 2015 - 2015 IEEE 12th International Conference on Networking, Sensing and Control
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2015 12th IEEE International Conference on Networking, Sensing and Control, ICNSC 2015
Y2 - 9 April 2015 through 11 April 2015
ER -