An Improved Back-Propagation Neural Networks using a Modified Non-linear Function

M.A. Otair and W.A. Salameh (Jordan)


Neural Networks, Backpropagation, Modified Backpropagation, Non-Linear Function, Optical Algorithm.


The backpropagation neural network (BP) is perhaps the most widely used architecture in the field of artificial neural networks, but it still suffers from several difficulties that hinder the complete success of this paradigm. In this paper we try to overcome some of these problems, particularly those concerning learning convergence rates, number of iterations, and error ratios. We use a newly proposed learning algorithm based on BP, and we have observed that a modified non-linear mapping function enhances the whole process in terms of learning speed and convergence. Comparing our approach with the classical one, the enhancement is clearly noticeable. The new approach has been applied to different neural network architectures and has shown tangible improvements.

Backpropagation (BP) learns a predefined set of input-output example pairs using a two-phase propagate-adapt cycle. After an input pattern has been applied as a stimulus to the first layer of network units, it is propagated through each upper layer until an output is generated. This output pattern is then compared to the desired output, and an error signal is computed for each output unit. The error signals are then transmitted backward from the output layer to each unit in the intermediate layer that contributes directly to the output. However, each unit in the intermediate layer receives only a portion of the total error signal, based roughly on the relative contribution the unit made to the original output. This process repeats, layer by layer, until each unit in the network has received an error signal that describes its relative contribution to the total error. Based on the error signal received, connection weights are then updated by each unit.
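The two-phase propagate-adapt cycle described above can be sketched as follows. This is a minimal illustration using the standard logistic sigmoid on a small XOR task; the paper's modified non-linear function is not reproduced here, and the network size, learning rate, and data are illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic activation; the paper replaces this with a
    # modified non-linear function (not shown in this sketch).
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
T = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs (XOR)

W1 = rng.normal(0, 1, (2, 4))  # input -> hidden connection weights
W2 = rng.normal(0, 1, (4, 1))  # hidden -> output connection weights
eta = 0.5                      # learning rate (illustrative value)

errors = []
for epoch in range(5000):
    # Phase 1 (propagate): the stimulus moves upward through each layer
    # until an output is generated.
    H = sigmoid(X @ W1)        # hidden-layer activations
    Y = sigmoid(H @ W2)        # network output

    # The output is compared to the desired output and an error signal
    # is computed for each output unit.
    delta_out = (T - Y) * Y * (1 - Y)

    # Phase 2 (adapt): error signals are transmitted backward; each
    # hidden unit receives a portion of the total error proportional to
    # its contribution, i.e. weighted by its outgoing connections.
    delta_hid = (delta_out @ W2.T) * H * (1 - H)

    # Based on the error signals received, each unit's connection
    # weights are updated.
    W2 += eta * H.T @ delta_out
    W1 += eta * X.T @ delta_hid

    errors.append(float(np.mean((T - Y) ** 2)))
```

Tracking the mean squared error per epoch makes the effect of the adapt phase visible: the error should decline as the weight updates accumulate.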
