Using Genetic Algorithms to Adapt Neuron Functional Forms

S. Narayan (USA)

Keywords

Neural Networks, Genetic Algorithms

Abstract

Multilayer Perceptron (MLP) networks are commonly applied to problems in prediction and classification. The use of a linear node connection function in the MLP model implies that MLP networks are best suited to problems where the different classes can be separated by hyperplanes. The use of alternative neuron models with nonlinear node connection functions is often hampered by mathematical constraints, such as differentiability, imposed by the use of gradient descent-based methods of learning. Genetic algorithms provide a viable alternative for training MLP networks and mitigate many of the difficulties associated with traditional gradient descent-based methods. This paper examines the feasibility of using a genetic algorithm to design neural networks that employ neurons with functional forms appropriate to the problem being solved. Application to classification problems indicates that a genetic algorithm is capable of both evolving network weights and selecting appropriate functional forms for neurons in the hidden layer.
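As a rough illustration of the idea summarized above, the sketch below encodes each hidden neuron's functional form as a gene alongside its weights and evolves both with a simple genetic algorithm on a toy XOR classification task. The candidate node functions, genome layout, fitness measure, and GA operators are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (assumed details): a GA that evolves both the weights and the
# per-neuron "functional form" of a single hidden layer on a toy XOR task.
import numpy as np

rng = np.random.default_rng(0)

# Assumed set of candidate node connection functions a hidden neuron may adopt.
FUNCTIONAL_FORMS = {
    0: lambda x, w: np.tanh(x @ w[:-1] + w[-1]),                         # inner-product (hyperplane) node
    1: lambda x, w: np.tanh(-np.sum((x - w[:-1]) ** 2, axis=1) + w[-1]), # distance-based (spherical) node
}

N_HIDDEN, N_IN = 4, 2
W_PER_NEURON = N_IN + 1  # weights plus bias per hidden neuron

# Toy XOR data: not separable by a single hyperplane.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

GENOME_LEN = N_HIDDEN + N_HIDDEN * W_PER_NEURON + N_HIDDEN + 1

def decode(genome):
    """Split a flat genome into per-neuron form genes, hidden weights, output weights."""
    forms = (genome[:N_HIDDEN] > 0).astype(int)  # discrete functional-form choice per neuron
    hidden = genome[N_HIDDEN:N_HIDDEN + N_HIDDEN * W_PER_NEURON].reshape(N_HIDDEN, W_PER_NEURON)
    out = genome[N_HIDDEN + N_HIDDEN * W_PER_NEURON:]  # output weights plus bias
    return forms, hidden, out

def forward(genome, X):
    forms, hidden, out = decode(genome)
    h = np.column_stack([FUNCTIONAL_FORMS[f](X, w) for f, w in zip(forms, hidden)])
    return 1.0 / (1.0 + np.exp(-(h @ out[:-1] + out[-1])))  # sigmoid output unit

def fitness(genome):
    return -np.mean((forward(genome, X) - y) ** 2)  # negative MSE: higher is better

pop = rng.normal(0, 1, size=(60, GENOME_LEN))
for gen in range(300):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]            # truncation selection
    parents = elite[rng.integers(0, 10, size=(50, 2))]
    cut = rng.integers(1, GENOME_LEN, size=50)
    mask = np.arange(GENOME_LEN) < cut[:, None]           # one-point crossover
    children = np.where(mask, parents[:, 0], parents[:, 1])
    children += rng.normal(0, 0.1, children.shape) * (rng.random(children.shape) < 0.2)  # mutation
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(g) for g in pop])]
print("forms chosen per hidden neuron:", decode(best)[0])
print("predictions:", np.round(forward(best, X), 2))
```

Because the GA only needs a fitness value, nothing in this scheme requires the node functions to be differentiable, which is the main point of the approach described in the abstract.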
