Genetic Optimization of ART Neural Network Architectures

A. Kaylani, M. Georgiopoulos, M. Mollaghasemi, and G. Anagnostopoulos (USA)


Machine Learning, Classification, ARTMAP, Genetic Algorithms, Genetic Operators, Category Proliferation.


Adaptive Resonance Theory (ART) neural network architectures, such as Fuzzy ARTMAP (FAM), Ellipsoidal ARTMAP (EAM), and Gaussian ARTMAP (GAM), have successfully solved a variety of classification problems. However, they suffer from an inherent ART shortcoming: they create larger architectures than necessary to solve the problem at hand (referred to as the ART category proliferation problem). This problem is especially pronounced for classification problems with noisy data and/or data, belonging to different labels, that significantly overlap. A variety of modified ART architectures, referred to as semi-supervised (ss) ART architectures (e.g., ssFAM, ssEAM, ssGAM) and summarily referred to as ssART, have addressed the category proliferation problem. In this paper, we propose another approach to solving the ART category proliferation problem: designing genetically engineered ART architectures, such as GFAM, GEAM, and GGAM, summarily referred to as GART. In particular, we explain how to design GART architectures and compare their performance (in terms of accuracy, size, and computational complexity) with that of the ssART architectures. Our results demonstrate that GART is superior to ssART, and quite often it produces the optimal classifier.
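The genetic approach summarized above — evolving a population of trained ART networks under operators that recombine and prune categories, with a fitness that rewards accuracy while penalizing size — can be illustrated with a minimal sketch. The chromosome encoding, the toy fitness proxy, and all operator parameters below are hypothetical stand-ins chosen for illustration; they are not the paper's actual GFAM/GEAM/GGAM procedure:

```python
import random

random.seed(0)

# Toy stand-in for a trained ART network: a chromosome is a list of
# category parameters. In GART, each chromosome would encode the
# categories of a trained FAM/EAM/GAM network; fitness rewards
# classification accuracy and penalizes the number of categories.

def fitness(chromosome):
    # Hypothetical accuracy proxy: coverage saturates as categories are
    # added, while each extra category costs a small size penalty.
    coverage = 1.0 - 0.5 ** len(chromosome) if chromosome else 0.0
    return coverage - 0.02 * len(chromosome)

def crossover(a, b):
    # One-point crossover on the category lists of two parents.
    cut_a = random.randint(0, len(a))
    cut_b = random.randint(0, len(b))
    return a[:cut_a] + b[cut_b:]

def mutate(chrom):
    # Category-deletion operator: occasionally prune one category,
    # directly attacking category proliferation.
    if chrom and random.random() < 0.3:
        chrom = chrom[:]
        del chrom[random.randrange(len(chrom))]
    return chrom

def evolve(pop, generations=50):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: len(pop) // 2]  # truncation selection with elitism
        children = [
            mutate(crossover(random.choice(elite), random.choice(elite)))
            for _ in range(len(pop) - len(elite))
        ]
        pop = elite + children
    return max(pop, key=fitness)

# Initial population: "trained networks" with varying category counts.
population = [
    [random.random() for _ in range(random.randint(1, 30))]
    for _ in range(20)
]
best = evolve(population)
print(len(best))  # a compact network balancing coverage against size
```

Under this toy fitness, large networks are penalized and the population drifts toward a small category count, mirroring how GART searches for small, accurate ART classifiers rather than accepting the proliferated network that training alone produces.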
