Abstract
Monte Carlo simulation is used to compare the performance of the back-propagation, conjugate-gradient, and finite-difference algorithms when training simple multilayer perceptron networks on pattern recognition and bit-counting problems. Twelve individual simulations are run for each training-algorithm/test-problem combination, for an overall total of 72 simulations. The random elements in each Monte Carlo simulation are the individual synaptic weights between layers, which are drawn from a uniform distribution. Two additional factors, the size of the hidden layer and the exponent of the error function, are also varied within the simulation plan outlined above.
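The record does not reproduce the thesis code, so the following is only a minimal sketch of the experimental design the abstract describes: repeated training runs of a one-hidden-layer perceptron whose initial synaptic weights are drawn uniformly at random, crossed with hidden-layer size and error-function exponent. The bit-counting targets, the form of the error gradient, and all names and parameter values below are illustrative assumptions, not details taken from the thesis, and only the back-propagation/bit-counting cell of the design is shown.

```python
# Sketch (not the author's code) of the Monte Carlo plan described in the abstract.
# Assumptions: tanh hidden units, a linear output, error function sum(|error|^p),
# and plain batch back-propagation; all hypothetical for illustration.
import numpy as np

def make_bit_count_data(n_bits=4):
    """All n-bit inputs paired with their (scaled) number of set bits."""
    X = np.array([[(i >> b) & 1 for b in range(n_bits)]
                  for i in range(2 ** n_bits)], dtype=float)
    y = X.sum(axis=1, keepdims=True) / n_bits      # scale counts into [0, 1]
    return X, y

def train_backprop(X, y, hidden, p, rng, epochs=2000, lr=0.1):
    """One Monte Carlo replication: train by back-propagation, return final error."""
    n_in = X.shape[1]
    # Random element of the run: uniformly distributed initial synaptic weights.
    W1 = rng.uniform(-1.0, 1.0, size=(n_in, hidden))
    W2 = rng.uniform(-1.0, 1.0, size=(hidden, 1))
    for _ in range(epochs):
        h = np.tanh(X @ W1)                        # hidden-layer activations
        out = h @ W2                               # linear output unit
        err = out - y
        # Gradient of sum(|err|^p) with respect to the output.
        g_out = p * np.abs(err) ** (p - 1) * np.sign(err)
        grad_W2 = h.T @ g_out
        grad_W1 = X.T @ ((g_out @ W2.T) * (1.0 - h ** 2))
        W2 -= lr * grad_W2 / len(X)
        W1 -= lr * grad_W1 / len(X)
    return (np.abs((np.tanh(X @ W1) @ W2) - y) ** p).sum()

# Twelve replications per cell, crossed with hidden-layer size and error exponent.
rng = np.random.default_rng(0)
X, y = make_bit_count_data()
for hidden in (2, 4):
    for p in (1.5, 2.0):
        errors = [train_backprop(X, y, hidden, p, rng) for _ in range(12)]
        print(f"hidden={hidden} p={p}: mean final error {np.mean(errors):.4f}")
```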
Library of Congress Subject Headings
Machine learning; Monte Carlo method; Pattern recognition systems; Back propagation (Artificial intelligence)--Evaluation; Conjugate gradient methods--Evaluation; Finite differences
Publication Date
2011
Document Type
Dissertation
Student Type
Graduate
Department, Program, or Center
School of Mathematical Sciences (COS)
Advisor
Engel, Alejandro
Recommended Citation
Wehry, Stephen, "Monte Carlo comparison of back-propagation, conjugate-gradient, and finite-difference training algorithms for multilayer perceptrons" (2011). Thesis. Rochester Institute of Technology. Accessed from
https://repository.rit.edu/theses/223
Campus
RIT – Main Campus
Comments
Physical copy available through RIT's Wallace Library at: Q325.5 .W34 2011