W. Yu and X. Li (Mexico)
Keywords: recurrent fuzzy neural networks, stable learning, discrete-time, identification
In general, fuzzy neural networks cannot match nonlinear systems exactly. Unmodeled dynamics can lead to parameter drift and even instability problems. According to system identification theory [8], robust modification terms must be included in order to guarantee Lyapunov stability. In this paper, input-to-state stability is applied to obtain robust training algorithms for recurrent fuzzy neural modeling. Both Mamdani-type and Takagi-Sugeno-Kang-type fuzzy neural networks are discussed. Stable learning algorithms for the premise part and the consequence part of the fuzzy rules are proposed. We show that the normal gradient descent law and the backpropagation algorithm with time-varying learning rates are stable in the sense of L∞.
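As a rough illustration of the kind of update the abstract describes, the sketch below trains the consequence weights of a simple Mamdani-type fuzzy model with a normalized (time-varying) learning rate, which is one common way to keep the gradient updates bounded. The membership-function placement, the nonlinear target `sin(x)`, and the specific rate formula `eta / (1 + ||phi||^2)` are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def gaussian_mf(x, centers, widths):
    """Gaussian membership values of scalar input x for each rule."""
    return np.exp(-((x - centers) ** 2) / (widths ** 2))

def fuzzy_output(x, centers, widths, weights):
    """Mamdani-style output: weighted average of rule consequences."""
    mu = gaussian_mf(x, centers, widths)
    phi = mu / (mu.sum() + 1e-12)   # normalized firing strengths
    return phi @ weights, phi

def train(samples, targets, centers, widths, weights, eta=0.5):
    """Gradient descent on consequence weights with a time-varying rate."""
    errors = []
    for x, d in zip(samples, targets):
        y, phi = fuzzy_output(x, centers, widths, weights)
        e = d - y
        # Normalized learning rate: shrinks with the regressor norm,
        # one standard way to keep the update bounded (assumption).
        eta_k = eta / (1.0 + phi @ phi)
        weights = weights + eta_k * e * phi
        errors.append(abs(e))
    return weights, errors

# Illustrative usage: identify sin(x) on [-3, 3] with 7 fuzzy rules.
centers = np.linspace(-3.0, 3.0, 7)
widths = np.full(7, 1.0)
weights = np.zeros(7)
xs = np.tile(np.linspace(-3.0, 3.0, 200), 3)   # three sweeps of the data
weights, errors = train(xs, np.sin(xs), centers, widths, weights)
```

After a few sweeps the identification error shrinks, since the model is linear in the consequence weights and the normalized update behaves like a standard LMS filter.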