ASME Press Select Proceedings
Intelligent Engineering Systems Through Artificial Neural Networks, Volume 17
Editor
C. H. Dagli
ISBN-10:
0791802655
No. of Pages:
650
Publisher:
ASME Press
Publication date:
2007

A multi-layer neural network with multiple hidden layers was trained as an autoencoder using the steepest descent, scaled conjugate gradient, and Alopex algorithms. The algorithms were applied in different combinations, with steepest descent and Alopex serving as pretraining algorithms followed by fine-tuning with scaled conjugate gradient; each algorithm was also used to train the autoencoders without any pretraining. Three datasets were used for training: USPS digits, MNIST digits, and Olivetti faces. The results were compared with those of Hinton and Salakhutdinov (2006) for the MNIST and Olivetti face datasets. The results confirm that pretraining is important for obtaining good reconstructions, although the pretraining approach of Hinton et al. achieves a lower RMSE than the other methods. Scaled conjugate gradient, however, proved to be the fastest computationally.
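The two-phase scheme described above can be sketched in a few lines. The snippet below is a minimal illustration, not the chapter's implementation: it uses a single-hidden-layer autoencoder on random toy data (standing in for the deep networks and the USPS/MNIST/Olivetti datasets), plain steepest descent for pretraining, and SciPy's nonlinear conjugate gradient as a stand-in for the scaled conjugate gradient (SCG) algorithm actually used.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d, h = 200, 8, 3                       # samples, input dim, code dim
X = rng.random((n, d))                    # toy data (stand-in for real datasets)

def unpack(theta):
    """Split a flat parameter vector into encoder/decoder weights."""
    i = 0
    W1 = theta[i:i + h * d].reshape(h, d); i += h * d
    b1 = theta[i:i + h]; i += h
    W2 = theta[i:i + d * h].reshape(d, h); i += d * h
    b2 = theta[i:i + d]
    return W1, b1, W2, b2

def loss_grad(theta):
    """Mean squared reconstruction error and its gradient."""
    W1, b1, W2, b2 = unpack(theta)
    H = 1.0 / (1.0 + np.exp(-(X @ W1.T + b1)))    # sigmoid encoder
    Y = H @ W2.T + b2                             # linear decoder
    E = Y - X
    loss = 0.5 * np.sum(E ** 2) / n
    dW2 = E.T @ H / n
    db2 = E.mean(axis=0)
    dZ = (E @ W2) * H * (1.0 - H)                 # backprop through sigmoid
    dW1 = dZ.T @ X / n
    db1 = dZ.mean(axis=0)
    return loss, np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

theta = rng.normal(scale=0.1, size=h * d + h + d * h + d)
loss_before = loss_grad(theta)[0]

# Phase 1: pretraining by steepest descent with a fixed step size.
for _ in range(100):
    _, g = loss_grad(theta)
    theta -= 0.5 * g

# Phase 2: fine-tuning with nonlinear conjugate gradient (stand-in for SCG).
res = minimize(loss_grad, theta, jac=True, method="CG",
               options={"maxiter": 200})
loss_after = res.fun
print(f"reconstruction loss before: {loss_before:.4f}, after: {loss_after:.4f}")
```

In the chapter's actual experiments, Alopex (a correlation-based stochastic optimizer) could also fill the pretraining phase, and RMSE on held-out reconstructions would be the reported metric.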

Abstract
1. Introduction
2. Background
3. Results
4. Statistical Analysis
5. Discussion
6. Conclusions
References