Evolutionary Algorithms for Hyperparameter Optimization in Neural Networks

Explore how Evolutionary Strategies (ES) and Genetic Algorithms (GA) are applied to optimize hyperparameters in neural networks for EMNIST and OMNIGLOT classification tasks, and the benefits of combining GA and ES to find optimal network architectures.

  • Evolutionary Algorithms
  • Hyperparameter Optimization
  • Neural Networks
  • Genetic Algorithm
  • EMNIST


Presentation Transcript


  1. MLP Project 2017-18: Evolutionary Algorithms for Hyperparameter Optimization. Group 52 (AID): Antonios Valais, Ian Mauldin, Dinesh Saravana Sundaram

  2. Evolutionary Algorithms
     • A smarter, nature-inspired search process
     • Searches a fitness landscape
     • Two families considered: Genetic Algorithm (GA) and Evolutionary Strategies (ES)
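To make the search over a fitness landscape concrete, here is a minimal, generic GA loop. The population size, elitism, and parent-selection scheme are illustrative assumptions, not the implementation used in the project.

```python
import random

def evolve(fitness, init_individual, crossover, mutate,
           pop_size=20, generations=10, n_elite=2):
    """Minimal generic GA loop: evaluate, keep elites, recombine, mutate."""
    population = [init_individual() for _ in range(pop_size)]
    for _ in range(generations):
        # Rank the population on the fitness landscape (higher is better).
        ranked = sorted(population, key=fitness, reverse=True)
        # Elitism: carry the best individuals over unchanged.
        next_gen = ranked[:n_elite]
        # Breed the rest from parents drawn out of the fitter half.
        while len(next_gen) < pop_size:
            parent_a, parent_b = random.sample(ranked[:pop_size // 2], 2)
            next_gen.append(mutate(crossover(parent_a, parent_b)))
        population = next_gen
    return max(population, key=fitness)
```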

  3. Applying ES and GA to neural networks
     • EMNIST and OMNIGLOT classification with fully-connected networks
     • Hyperparameters: number of hidden layers, number of neurons, activation functions, learning rules
     • Encode the hyperparameters in a chromosome
     • Train each chromosome as a different neural network architecture
     • Fitness is performance on the validation set
     • The GA searches for optimal fitness (classification performance)
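A minimal sketch of the encoding and fitness evaluation described in this slide, assuming a dictionary chromosome and illustrative hyperparameter ranges; build_and_train and validation_accuracy are hypothetical placeholders for the group's actual EMNIST/OMNIGLOT training and evaluation code.

```python
import random

# Hypothetical search space; the actual bounds used by the group
# are not given in the slides.
ACTIVATIONS = ["relu", "tanh", "sigmoid"]
LEARNING_RULES = ["sgd", "adam", "rmsprop"]

def random_chromosome():
    """Encode one fully-connected architecture as a chromosome (a dict here)."""
    return {
        "hidden_layers": random.randint(1, 4),
        "neurons_per_layer": random.choice([64, 128, 256, 512]),
        "activation": random.choice(ACTIVATIONS),
        "learning_rule": random.choice(LEARNING_RULES),
    }

def crossover(parent_a, parent_b):
    """Uniform crossover: each gene is taken from one parent at random."""
    return {k: random.choice([parent_a[k], parent_b[k]]) for k in parent_a}

def mutate(chromosome, rate=0.2):
    """Re-sample each gene with a small probability."""
    fresh = random_chromosome()
    return {k: (fresh[k] if random.random() < rate else v)
            for k, v in chromosome.items()}

def fitness(chromosome):
    """Fitness = classification performance on the validation set.
    build_and_train / validation_accuracy stand in for the project's
    actual training and evaluation code."""
    model = build_and_train(chromosome)   # hypothetical helper
    return validation_accuracy(model)     # hypothetical helper
```

With these pieces, the generic evolve() loop sketched earlier would search for the chromosome with the best validation performance.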

  4. Conclusions
     • GA: global search process; requires defining initial bounds on the hyperparameters; works on schemas
     • ES: gradient-following local search; performance depends on the starting point; can be trapped in local optima
     • GA + ES: combines the advantages and mitigates the disadvantages of both search processes; found our best network architecture for OMNIGLOT
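One way to read the GA + ES combination is as a two-stage search: the GA performs the global search within the initial hyperparameter bounds, and an ES then refines the best chromosome locally. The sketch below illustrates this with a simple (1+1)-ES style hill climber; the acceptance rule and step count are assumptions, not the group's actual procedure.

```python
def es_refine(chromosome, fitness, mutate, steps=20):
    """(1+1)-ES style local search: start from the GA's best chromosome and
    keep a mutated candidate only if it does not decrease fitness."""
    best, best_fit = chromosome, fitness(chromosome)
    for _ in range(steps):
        candidate = mutate(best)
        candidate_fit = fitness(candidate)
        if candidate_fit >= best_fit:   # accept improvements (and ties)
            best, best_fit = candidate, candidate_fit
    return best

# Hypothetical usage, reusing the earlier sketches:
# ga_best = evolve(fitness, random_chromosome, crossover, mutate)
# final_architecture = es_refine(ga_best, fitness, mutate)
```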
