EFFICIENT TRAINING OF NEURAL NETWORKS USING OPTICAL BACKPROPAGATION WITH MOMENTUM FACTOR

M.A. Otair∗ and W.A. Salameh∗∗

Keywords

Neural networks, backpropagation, momentum, optical backpropagation, non-linear function

Abstract

The backpropagation (BP) algorithm [1], which is commonly used to train multilayer neural networks, can take a long time to converge in many cases. Momentum is a standard method used to speed up convergence and escape local minima. This paper presents a new technique, the optical backpropagation (OBP) algorithm [2] with a momentum factor, that aims to speed up the training process. The technique is evaluated through a comparative analysis against similar algorithms in the literature: BP, BPM, Quickprop, and Delta-bar-Delta. The efficiency of the proposed algorithm is demonstrated through experiments on two training problems, XOR and character recognition, in which it outperforms the existing algorithms. These promising results suggest that the proposed algorithm may be applied to various real-world classification tasks.
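For context, the standard momentum update referred to above (the mechanism behind the BPM baseline) can be sketched as follows. The learning rate, momentum factor, and toy quadratic objective are illustrative assumptions, not values taken from the paper:

```python
def momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    """One momentum-based weight update.

    The velocity term accumulates an exponentially decaying sum of past
    gradients, which damps oscillation and helps escape shallow local
    minima -- the speed-up momentum provides over plain gradient descent.
    lr and beta are hypothetical example values.
    """
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Example: minimise f(w) = w^2 (gradient 2w) starting from w = 5.0.
w, v = 5.0, 0.0
for _ in range(200):
    w, v = momentum_step(w, 2.0 * w, v)
print(w)  # converges toward the minimum at 0
```

In a full network, the same update is applied element-wise to every weight, with `grad` supplied by backpropagation.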
