Multilayer Perceptron optimization through Simulated Annealing and Fast Simulated Annealing

Authors

P.H.C. Camelo and R.L. de Carvalho

DOI:

https://doi.org/10.20873/ajceam.v1i2.9474

Keywords:

Neural Network, Multilayer Perceptron, Simulated Annealing, MNIST Database

Abstract

The Multilayer Perceptron (MLP) is a classic and widely used neural network model in machine learning applications. Like most classifiers, MLPs need well-chosen parameters to produce optimized results. Machine learning engineers generally use grid search to optimize a model's hyper-parameters, which requires re-training the model for every candidate configuration. In this work, we report a computational experiment that uses the metaheuristics Simulated Annealing (SA) and Fast Simulated Annealing (FastSA) to optimize MLP hyper-parameters. In the reported experiment, the metaheuristics optimize two aspects of the model: the configuration of the neural network layers and its neuron weights. The experiment compares the best MLPs produced by SA and FastSA, using accuracy and classifier complexity as comparison measures. The MLPs are optimized to produce a classifier for the MNIST database. The experiment showed that FastSA produced a better MLP while using less computational time and fewer fitness evaluations.
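The accept/reject loop at the core of Simulated Annealing can be sketched as below. This is a minimal illustration, not the paper's implementation: the fitness function is a hypothetical stand-in for the validation accuracy of an MLP with `n` hidden neurons (including a small penalty for network complexity), and the cooling schedule and neighbor move are illustrative choices.

```python
import math
import random

def simulated_annealing(fitness, neighbor, x0, t0=1.0, cooling=0.95,
                        steps=200, seed=0):
    """Minimal SA for maximization: always accept improvements,
    accept worse candidates with Boltzmann probability exp(delta / T)."""
    rng = random.Random(seed)
    x, fx = x0, fitness(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = fitness(y)
        delta = fy - fx
        if delta >= 0 or rng.random() < math.exp(delta / t):
            x, fx = y, fy
            if fx > fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling (illustrative schedule)
    return best, fbest

# Hypothetical stand-in for "accuracy of an MLP with n hidden neurons":
# peaks near n = 32 and mildly penalizes larger (more complex) networks.
def toy_fitness(n):
    return 1.0 - abs(n - 32) / 100.0 - 0.001 * n

def neighbor(n, rng):
    # Perturb the hidden-layer size by a small random step, keeping n >= 1.
    return max(1, n + rng.choice([-4, -2, -1, 1, 2, 4]))

best_n, best_f = simulated_annealing(toy_fitness, neighbor, x0=8)
```

FastSA differs mainly in its faster cooling schedule (of the form T(k) = T0 / (1 + k)) and heavier-tailed visiting distribution, which is what allows it to reach comparable solutions with fewer fitness evaluations.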


Published

2020-06-10

How to Cite

[1]
Camelo, P.H.C. and de Carvalho, R.L. 2020. Multilayer Perceptron optimization through Simulated Annealing and Fast Simulated Annealing. Academic Journal on Computing, Engineering and Applied Mathematics. 1, 2 (Jun. 2020), 28–31. DOI: https://doi.org/10.20873/ajceam.v1i2.9474.


Section

Brief Communication
