ASME Press Select Proceedings
Intelligent Engineering Systems through Artificial Neural Networks, Volume 16
Editors
Cihan H. Dagli
Anna L. Buczak
David L. Enke
Mark Embrechts
Okan Ersoy
ISBN-10: 0791802566
No. of Pages: 1000
Publisher: ASME Press
Publication date: 2006

Adaptive Resonance Theory (ART) neural networks are a popular class of neural network classifiers based on the adaptive resonance theory developed by Grossberg. ART neural networks have a number of desirable features: guaranteed convergence to a solution, on-line learning capability, identification of novel inputs, the ability to explain the answers that they produce, and good performance on a number of classification problems in a variety of application areas. Two members of the class of ART classifiers that have been introduced in the literature are Gaussian ARTMAP (GAM) and Distributed Gaussian ARTMAP (dGAM). The difference between them is that, in its learning phase, dGAM allows more than one ART node to learn the input pattern, whereas GAM allows only one ART node to learn it (GAM is a winner-take-all ART network). The inventors of dGAM claimed that it addresses the category proliferation problem observed in many winner-take-all ART networks, such as Gaussian ARTMAP, Fuzzy ARTMAP, and Ellipsoidal ARTMAP, amongst others. The category proliferation problem arises when an ART network, in the process of learning the required classification task, creates more ART nodes than necessary; it is more acute when the ART networks are faced with noisy and/or significantly overlapping data. However, the claim that dGAM outperforms GAM by creating smaller ART networks has not been substantiated in the literature. In this paper, a thorough experimental comparison of the performance of Gaussian ARTMAP (GAM) and distributed Gaussian ARTMAP (dGAM) is provided. In the process, a new measure of performance for a neural network is introduced. This measure relies on two factors of goodness: the network's size and the network's generalization performance (i.e., the performance of the trained ART classifier on unseen data). Obviously, a small ART network with good generalization performance is desired. Previous comparisons of ART-like classifiers relied on a trial-and-error procedure (a time-consuming and occasionally unreliable one) to produce a good-performing ART network. The proposed measure of performance allows one to arrive at a good ART network through an automated and reliable process.
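For orientation, the learning-rule distinction the abstract draws between GAM and dGAM can be sketched in a few lines of Python. This is a minimal sketch under simplifying assumptions: the Gaussian-like activation, the learning rate, the update form, and the combined_score trade-off are illustrative stand-ins, not the actual GAM/dGAM equations or the paper's performance measure (whose formula the abstract does not give).

import numpy as np

def winner_take_all_update(weights, activations, x, lr=0.1):
    """GAM-style sketch: only the single most active ART category
    node moves toward the input pattern."""
    j = int(np.argmax(activations))
    weights[j] += lr * (x - weights[j])
    return weights

def distributed_update(weights, activations, x, lr=0.1):
    """dGAM-style sketch: every category node moves toward the input,
    with credit proportional to its activation (more than one node
    learns each pattern)."""
    credit = activations / activations.sum()
    weights += lr * credit[:, None] * (x - weights)
    return weights

def combined_score(num_nodes, test_accuracy, size_penalty=0.01):
    """Hypothetical stand-in for the paper's proposed measure: rewards
    generalization accuracy and penalizes network size. The actual
    formula is not given in the abstract."""
    return test_accuracy - size_penalty * num_nodes

# Toy example: three category nodes in a 2-D input space.
rng = np.random.default_rng(0)
weights = rng.random((3, 2))
x = np.array([0.5, 0.5])
activations = np.exp(-np.sum((weights - x) ** 2, axis=1))  # Gaussian-like match

weights_wta = winner_take_all_update(weights.copy(), activations, x)
weights_dist = distributed_update(weights.copy(), activations, x)

In the winner-take-all case only one row of the weight matrix changes per input presentation, which is the mechanism the abstract links to category proliferation; in the distributed case the same credit is spread across all nodes.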

Abstract
Introduction
Gaussian Artmap Architectures
Experiments
Conclusions
Acknowledgments
References