Multilayer Perceptron optimization through Simulated Annealing and Fast Simulated Annealing

Authors

DOI:

https://doi.org/10.20873/ajceam.v1i2.9474

Keywords:

Neural Network, Multilayer Perceptron, Simulated Annealing, MNIST Database

Abstract

The Multilayer Perceptron (MLP) is a classic and widely used neural network model in machine learning applications. Like most classifiers, MLPs need well-defined parameters to produce optimized results. Machine learning engineers commonly use grid search to optimize a model's hyper-parameters, which requires re-training the model for each configuration. In this work, we report a computational experiment that uses the Simulated Annealing (SA) and Fast Simulated Annealing (FastSA) metaheuristics to optimize MLP hyper-parameters. In the reported experiment, the metaheuristics optimize two aspects of the network: the configuration of the neural network layers and its neuron weights. The experiment compares the best MLPs produced by SA and FastSA, using accuracy and classifier complexity as comparison measures. The MLPs are optimized to produce a classifier for the MNIST database. The experiment showed that FastSA produced a better MLP in less computational time and with fewer fitness evaluations.
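The accept/reject loop behind the approach described above can be sketched generically. The following is a minimal illustration of simulated annealing with a geometric cooling schedule, not the authors' implementation; a toy one-dimensional fitness stands in for MLP accuracy, and the parameter names (`t0`, `cooling`, `steps`) are assumptions for the sketch.

```python
import math
import random

def simulated_annealing(fitness, neighbor, initial, t0=1.0, cooling=0.95, steps=500):
    """Generic simulated annealing that maximizes `fitness`.

    `neighbor` proposes a perturbed candidate; worse candidates are
    accepted with probability exp(delta / t), which shrinks as the
    temperature t cools, so the search gradually becomes greedy.
    """
    current = initial
    best = current
    t = t0
    for _ in range(steps):
        candidate = neighbor(current)
        delta = fitness(candidate) - fitness(current)
        # Always accept improvements; accept worse moves with prob. exp(delta/t)
        if delta >= 0 or random.random() < math.exp(delta / t):
            current = candidate
        if fitness(current) > fitness(best):
            best = current
        t *= cooling  # geometric cooling schedule
    return best

# Toy stand-in for "MLP accuracy": maximize -(x - 3)^2, optimum at x = 3
random.seed(0)
result = simulated_annealing(
    fitness=lambda x: -(x - 3.0) ** 2,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    initial=0.0,
)
```

In the experiment reported in the paper, the candidate state would instead encode the layer configuration and neuron weights, and the fitness would be classification accuracy on MNIST; FastSA differs chiefly in using a faster-decaying (Cauchy-style) cooling schedule.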

Published

2020-06-10

How to Cite

[1]
Camelo, H.C. and de Carvalho, R.L. 2020. Multilayer Perceptron optimization through Simulated Annealing and Fast Simulated Annealing. Academic Journal on Computing, Engineering and Applied Mathematics. 1, 2 (Jun. 2020), 28–31. DOI: https://doi.org/10.20873/ajceam.v1i2.9474.

Section

Brief Communication
