A NEW ROBUST WEIGHT UPDATE FOR MULTILAYER-PERCEPTRON ADAPTIVE CONTROL

C.J.B. Macnab

Keywords

Multilayer perceptrons, neural network control, direct adaptive control, flexible-joint robots, Lyapunov stability, nonlinear systems

Abstract

This work addresses the problem of weight drift in direct adaptive control of underdamped systems using a multilayer perceptron (backpropagation network). When the number of hidden units in the neural network is small, the modeling error causes severe weight drift. Weight drift can lead to a control effort that chatters about the origin, which may excite the system's natural frequency. For this type of system, the traditional robust weight-update methods of e-modification and deadzone sacrifice performance in order to prevent the weight drift. This paper proposes a new method that prevents weight drift without sacrificing performance: a set of alternate weights is trained online to approximate the output of the original (control) weights, and the weight-update law is designed to keep the control weights from drifting too far from these alternate weights. A Lyapunov analysis proves semi-global uniform ultimate boundedness of all signals. A simulated experiment, trajectory tracking of a two-link flexible-joint robot using the backstepping method, illustrates the improvement in performance over e-modification and deadzone. The new method not only achieves tracking performance ten times better than the other methods, but also eliminates the chatter in the control.
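The sketch below is a minimal, hypothetical illustration of the idea described above, not the paper's actual update laws: the basis vector phi, scalar tracking error e, and all gains (gamma, nu, eta) are assumptions for illustration, and the true laws follow from the paper's Lyapunov analysis. It contrasts e-modification, whose leakage term pulls the weights toward zero, with the proposed scheme, in which alternate weights are trained online to approximate the control weights' output and the leakage term pulls the control weights toward those alternate weights instead.

```python
import numpy as np

def emod_step(W, phi, e, gamma=1.0, nu=0.1):
    """e-modification (sketch): leakage pulls the weights toward zero to stop
    drift, at the cost of unlearning and degraded tracking performance."""
    return W + gamma * (phi * e - nu * abs(e) * W)

def alternate_weight_step(W, A, phi, e, gamma=1.0, nu=0.1, eta=0.5):
    """Proposed idea (sketch): alternate weights A are trained online so that
    A @ phi approximates the control-weight output W @ phi, and the leakage
    term pulls W toward A rather than toward zero, bounding drift without
    discarding the learned approximation."""
    W_new = W + gamma * (phi * e - nu * abs(e) * (W - A))
    # Gradient step driving the alternate weights' output toward W's output.
    A_new = A + eta * phi * (W_new @ phi - A @ phi)
    return W_new, A_new
```

In this one-dimensional-output sketch, W, A, and phi are vectors of the same length and e is a scalar; a full implementation would apply the same structure to every layer of the multilayer perceptron.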
