RLVQ Determination using OWA Operators

A. Caţaron (Romania) and R. Andonie (USA)

Keywords

machine learning, learning vector quantization, ordered weighted aggregation, feature ranking

Abstract

Relevance Learning Vector Quantization (RLVQ) (introduced in [1]) is a variation of Learning Vector Quantization (LVQ) which allows a heuristic determination of relevance factors for the input dimensions. The method is based on Hebbian learning and defines weighting factors of the input dimensions which are automatically adapted to the specific problem. These relevance factors increase the overall performance of the LVQ algorithm. At the same time, relevances can be used for feature ranking and input dimensionality reduction. We introduce a different method for computing the relevance of the input dimensions in RLVQ. The relevances are computed on-line as Ordered Weighted Aggregation (OWA) weights. OWA operators are a family of mean-type aggregation operators [2]. The principal benefit of our OWA-RLVQ algorithm is that it connects RLVQ to the mathematically consistent OWA models.
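As a minimal illustration of the OWA operators mentioned above (this is a generic sketch of the standard OWA definition, not the paper's relevance-update rule), an OWA operator applies a weight vector to the rank-ordered inputs rather than to fixed input positions, so the weights interpolate between the min, the max, and the arithmetic mean:

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Aggregation: weights are applied to the
    inputs after sorting them in descending order, so each weight
    attaches to a rank position, not to a particular input."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "OWA weights must sum to 1"
    ordered = np.sort(values)[::-1]  # descending order
    return float(np.dot(weights, ordered))

# Classic special cases of the OWA family:
vals = [0.2, 0.9, 0.5]
print(owa(vals, [1, 0, 0]))            # weight on the largest -> max (0.9)
print(owa(vals, [0, 0, 1]))            # weight on the smallest -> min (0.2)
print(owa(vals, [1/3, 1/3, 1/3]))      # uniform weights -> arithmetic mean
```

In OWA-RLVQ, as the abstract states, such a weight vector is learned on-line and reused as the relevance profile of the input dimensions.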
