Nonlinear Eigenspace Models based on Fast Statistical Learning Algorithm

Yohei Takeuchi, Momoyo Ito, and Minoru Fukumi

Keywords

Pattern Recognition, Feature Extraction, Kernel method

Abstract

In the field of pattern recognition, feature extraction plays an important role prior to classification, for example to filter out background noise and to reduce the input dimensionality. Fisher Linear Discriminant Analysis (FLDA) is one of the best-known feature extraction methods. In recent years, FLDA has been improved in various ways so that an eigenspace is learned faster and/or classification performance is improved. Simple-FLDA (SFLDA) has been proposed to speed up learning by improving the FLDA algorithm. However, these methods operate in the input space and can therefore be inefficient when the data distribution is complex. Simple Kernel Discriminant Analysis (SKDA), an improved version of Kernel Discriminant Analysis (KDA), has therefore been proposed to achieve better classification performance by applying the kernel trick. Although SKDA performs better than SFLDA, its learning time is longer. In this paper, a further improvement is applied to the SKDA algorithm, yielding an improved version of SKDA (SIKDA). SIKDA matches the classification performance of SKDA while learning faster than SKDA. These results are demonstrated experimentally; in particular, the effect of the proposed method is clearly visible on one specific dataset.
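To make the role of the kernel trick concrete, the following is a minimal sketch of standard two-class kernel Fisher discriminant analysis (the family of methods to which KDA/SKDA belong), not the authors' SKDA or SIKDA algorithms. All names (rbf_kernel, kfda_direction, project) and the regularization parameter reg are illustrative assumptions.

    # Minimal two-class kernel Fisher discriminant sketch (NumPy).
    # Illustrates the kernel trick: the discriminant is computed from the
    # kernel matrix K only, never from explicit feature-space coordinates.
    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
        d2 = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T)
        return np.exp(-gamma * d2)

    def kfda_direction(X, y, gamma=1.0, reg=1e-3):
        """Dual coefficients alpha of the discriminant direction
        w = sum_i alpha_i * phi(x_i), for binary labels y in {0, 1}."""
        K = rbf_kernel(X, X, gamma)
        n = len(y)
        idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
        # Class means in the kernel-induced feature space, expressed via K.
        M0, M1 = K[:, idx0].mean(axis=1), K[:, idx1].mean(axis=1)
        # Within-class scatter N = sum_c K_c (I - 1/n_c) K_c^T, regularized
        # so that the linear solve below is well conditioned.
        N = reg * np.eye(n)
        for idx in (idx0, idx1):
            Kc, nc = K[:, idx], len(idx)
            N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
        # Maximizing the Rayleigh quotient gives alpha proportional to
        # N^{-1} (M1 - M0).
        return np.linalg.solve(N, M1 - M0)

    def project(X_train, X_new, alpha, gamma=1.0):
        # One-dimensional projection of new samples onto the discriminant.
        return rbf_kernel(X_new, X_train, gamma) @ alpha

Because every quantity is expressed through kernel evaluations, the data can be separated nonlinearly in input space while the optimization stays linear in the dual coefficients; this is the property that kernel variants such as KDA exploit, at the cost of the extra learning time the abstract mentions.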
