R. Kassab and J.-C. Lamirel (France)
abstraction methodology, competitive Hebbian learning, clustering, topology, neural networks
Competitive learning neural networks are powerful analytical tools for data clustering and topology-preserving visualization. However, they are limited in that a single network cannot perform more than one of these tasks: when applied to clustering, each neuron unit is expected to represent one of the clusters inherent in the data, whereas learning the topology requires many more units. The aim of this work is to show how connections between neurons, especially those inserted by competitive Hebbian learning, can be exploited to construct higher abstraction levels of a topology-preserving neural network. The idea is to devote the basic level of the network to a detailed description of the data, while using the higher abstraction levels to facilitate quantitative analysis and a structural overview of the network. Abstraction is performed in two different fashions: macro and micro abstraction. In macro abstraction, the objective is to cluster the first-level neuron units into their principal groups, corresponding to the clusters inherent in the data. In contrast, micro abstraction is designed to capture underlying clusters at a given degree of granularity. The abstraction-level units are themselves connected to reflect a simplified structure of the data distribution. In the end, the basic level of the network can be hierarchically organized with one or more abstraction levels, permitting both qualitative and quantitative analysis of the data. Simulations on several synthetic datasets show the relevance of the proposed model.
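The two building blocks the abstract combines can be illustrated concretely. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: competitive Hebbian learning inserts an edge between the two best-matching units for each input, yielding a topology-preserving graph, and a macro-abstraction step then groups the first-level units into their principal clusters by taking the connected components of that graph. The unit count, learning rates, and two-blob test data are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): competitive Hebbian learning
# followed by a macro abstraction via connected components of the edge graph.
import numpy as np

rng = np.random.default_rng(0)

def competitive_hebbian_learning(data, n_units=20, epochs=10,
                                 lr_winner=0.05, lr_second=0.01):
    """Place units with simple competitive updates and insert an edge
    between the best- and second-best-matching units for each input."""
    units = data[rng.choice(len(data), n_units, replace=False)].copy()
    edges = set()
    for _ in range(epochs):
        for x in rng.permutation(data):
            d = np.linalg.norm(units - x, axis=1)
            w1, w2 = np.argsort(d)[:2]          # winner and runner-up
            units[w1] += lr_winner * (x - units[w1])
            units[w2] += lr_second * (x - units[w2])
            edges.add(tuple(sorted((int(w1), int(w2)))))  # Hebbian edge
    return units, edges

def macro_abstraction(n_units, edges):
    """Group first-level units into principal clusters: each connected
    component of the Hebbian graph becomes one abstraction-level unit."""
    labels = list(range(n_units))
    def find(i):                                # union-find with path halving
        while labels[i] != i:
            labels[i] = labels[labels[i]]
            i = labels[i]
        return i
    for a, b in edges:
        labels[find(a)] = find(b)
    return [find(i) for i in range(n_units)]

# Two well-separated Gaussian blobs should yield two components,
# i.e. two macro-abstraction units.
data = np.vstack([rng.normal(0, 0.3, (100, 2)),
                  rng.normal(5, 0.3, (100, 2))])
units, edges = competitive_hebbian_learning(data)
print(sorted(set(macro_abstraction(len(units), edges))))  # component labels
```

Under these assumptions, the number of distinct component labels plays the role of the macro-abstraction cluster count; a micro abstraction would instead split components further according to a chosen granularity, for example by pruning long edges before taking components.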