Multi-objective Simulated Annealing for Hyper-parameter Optimization in Convolutional Neural Networks

Ayla Gülcü (@aylagulcu) and Zeki Kuş, Dept. of Computer Science, Fatih Sultan Mehmet University, Istanbul, Turkey

This repository contains code for the paper: Multi-objective simulated annealing for hyper-parameter optimization in convolutional neural networks

Citation

If you find our code useful, please consider citing our work using the following BibTeX entry:

@article{gulcu2021multi,
  title={Multi-objective simulated annealing for hyper-parameter optimization in convolutional neural networks},
  author={G{\"u}lc{\"u}, Ayla and Ku{\c{s}}, Zeki},
  journal={PeerJ Computer Science},
  volume={7},
  pages={e338},
  year={2021},
  publisher={PeerJ Inc.}
}

Environment

  • Python3
  • Keras
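
During the search, each sampled hyper-parameter configuration is materialized as a Keras model and evaluated. As a rough illustration of what a candidate might look like, here is a minimal sketch; the parameter names ("conv_filters", "kernel_size", "dense_units") and ranges are hypothetical, not the repository's actual search-space encoding:

from tensorflow.keras import layers, models

def build_candidate(config):
    # Build a small CIFAR-10 CNN from a hypothetical hyper-parameter dict.
    # Illustrative only; the paper's actual search space may differ.
    model = models.Sequential()
    model.add(layers.InputLayer(input_shape=(32, 32, 3)))  # CIFAR-10 images
    for filters in config["conv_filters"]:
        model.add(layers.Conv2D(filters, config["kernel_size"],
                                padding="same", activation="relu"))
        model.add(layers.MaxPooling2D())
    model.add(layers.Flatten())
    model.add(layers.Dense(config["dense_units"], activation="relu"))
    model.add(layers.Dense(10, activation="softmax"))  # 10 CIFAR-10 classes
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# One hypothetical candidate configuration:
model = build_candidate({"conv_filters": [32, 64],
                         "kernel_size": 3,
                         "dense_units": 128})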

Overview

In this study, we model CNN hyper-parameter optimization as a bi-criteria optimization problem, where the first objective is the classification accuracy and the second is the computational complexity, measured as the number of floating-point operations (FLOPs). For this bi-criteria problem, we develop a Multi-Objective Simulated Annealing (MOSA) algorithm for obtaining high-quality solutions with respect to both objectives.

CIFAR-10 is selected as the benchmark dataset, and the MOSA trade-off fronts obtained for this dataset are compared to the fronts generated by a single-objective Simulated Annealing (SA) algorithm using several front evaluation metrics such as generational distance, spacing, and spread. The comparison results suggest that the MOSA algorithm searches the objective space more effectively than the SA method. For each method, some front solutions are selected for longer training in order to see their actual performance on the original test set; again, the results show that MOSA outperforms SA in the multi-objective setting.

The performance of the MOSA configurations is also compared to other search-generated and human-designed state-of-the-art architectures. The network configurations generated by MOSA are not dominated by those architectures, so the proposed method can be of great use when computational complexity is as important as test accuracy.
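
The key departure from single-objective SA is the acceptance step: instead of comparing a single scalar energy, candidate and current solutions are compared by Pareto dominance over the objective pair (classification error, FLOPs). Below is a minimal sketch of a generic dominance-based acceptance rule, assuming both objectives are minimized and pre-scaled to comparable ranges; the paper's exact acceptance probability and scaling may differ:

import math
import random

def dominates(a, b):
    # True if objective vector a Pareto-dominates b, with both objectives
    # minimized: here a = (error, flops), where error = 1 - accuracy.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def accept(current, candidate, temperature):
    # Generic MOSA acceptance: a candidate not dominated by the current
    # solution is always accepted; a dominated one is accepted with a
    # Boltzmann-style probability. Objectives are assumed pre-scaled,
    # otherwise the FLOPs term would swamp the error term.
    if not dominates(current, candidate):
        return True
    delta = sum(c - k for k, c in zip(current, candidate))  # > 0: worse
    return random.random() < math.exp(-delta / temperature)

In a full run, every evaluated configuration would also be filtered into an external archive of non-dominated solutions; that archive is what the trade-off fronts in the figures below visualize.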

Figure: Visual comparison of the search ability of MOSA and random search (RS) in terms of objective-space distribution and Pareto fronts, with (A) random seed 10, (B) random seed 20, (C) random seed 30.

Figure: Visual comparison of the search ability of MOSA and SA in terms of objective-space distribution and Pareto fronts, with (A) random seed 10, (B) random seed 20, (C) random seed 30.

Table: Comparison of MOSA architectures to other search-generated architectures.
