Unveiling the Potential of Randomly Wired Networks in Neural Architecture Search

Randomly wired networks are on the rise; have we been creating the wrong networks all along?

The emergence of randomly wired networks is challenging traditional approaches to network design. This presentation examines Neural Architecture Search (NAS) from the perspectives of cognitive science and MCMC, covering approaches such as evolving networks through operators (morphisms) and slower methods like Reinforcement Learning (RL), Recurrent Neural Networks (RNNs), and genetic algorithms. The importance of random wiring and its impact on network performance are also analyzed, touching on how innate abilities are decoupled from learned traits in brain wiring. Through a survey and an analysis of studies such as Xie et al.'s work on randomly wired neural networks, it aims to provide insights into the future of network optimization.

  • Randomly Wired Networks
  • Neural Architecture Search
  • Cognitive Science
  • Network Evolution
  • Network Optimization




Presentation Transcript


  1. Randomly Wired Networks are on the rise; have we been creating the wrong networks all along? CSE 598, Mudit Verma

  2. TIMELINE 1) Aim 2) Background & failed attempt to survey 3) Perspective of cognitive science 4) Analyze the Xie et al. FAIR article & decide 5) Implementation 6) Results 7) DIY stuff

  3. BACKGROUND Neural Architecture Search 1. Slow approaches: RL / RNN / genetic algorithms 2. Weight sharing 3. Evolving the network through operators (morphism) 4. Generate & test

  4. BACKGROUND Neural Architecture Search 1. Slow approaches: RL / RNN / genetic algorithms 2. Weight sharing 3. Evolving the network through operators (morphism) 4. Generate & test ~ the opposite of intelligence

  5. AIM: Survey - look at Neural Architecture Search through the lens of MCMC.

  6. SURVEY MCMC x Neural Architecture Search: MCMC for deep learning (generative models etc.); random search on NAS (morphisms); obtaining a Bayesian network DAG from data; random walks on graphs, sampling from distributions.

  7. SURVEY MCMC x Neural Architecture Search: MCMC for deep learning (generative models etc.); random search on NAS (morphisms); obtaining a Bayesian network DAG from data; random walks on graphs, sampling from distributions. Xie et al.: Exploring Randomly Wired Neural Networks for Image Recognition.

  8. AIM: Survey - look at Neural Architecture Search through the lens of MCMC. Gain insights into the importance of random wiring; analyze Xie et al. and agree/disagree; report results; write down some ideas.

  9. Importance of Random Wiring Brain wiring: innate abilities are decoupled from the ones learned during a lifetime. Weight initialization has minimal effect on one-shot NAS. Small-world assumption.

  10. THEME Are the graphs that score high on the accuracy leaderboard random?

  11. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test

  12. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test Markov chain moves: stay, swap edge, add/remove node, add/remove edge (see the sketch below).
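A minimal sketch of such a graph Markov chain, using networkx. The move set (stay, swap edge, add/remove node, add/remove edge) comes from the slide; the uniform move probabilities, the starting state, and all function names are illustrative assumptions.

```python
import random
import networkx as nx

def mc_step(G: nx.Graph) -> nx.Graph:
    """One move of the graph Markov chain: stay, swap an edge,
    add/remove a node, or add/remove an edge."""
    G = G.copy()
    move = random.choice(["stay", "swap_edge", "node", "edge"])  # uniform moves: an assumption
    nodes = list(G.nodes)
    if move == "swap_edge" and G.number_of_edges() > 0:
        u, v = random.choice(list(G.edges))
        w = random.choice(nodes)
        if w not in (u, v) and not G.has_edge(u, w):
            G.remove_edge(u, v)
            G.add_edge(u, w)  # re-attach one endpoint elsewhere
    elif move == "node":
        if random.random() < 0.5:
            new = max(nodes) + 1
            G.add_edge(new, random.choice(nodes))  # attach the new node somewhere
        elif len(nodes) > 2:
            G.remove_node(random.choice(nodes))    # may disconnect; not repaired in this sketch
    elif move == "edge":
        u, v = random.sample(nodes, 2)
        if G.has_edge(u, v):
            G.remove_edge(u, v)
        else:
            G.add_edge(u, v)
    return G

def sample_graphs(n_nodes=16, steps=500, n_samples=12):
    """Run the chain and keep its last n_samples states as candidate graphs."""
    G = nx.path_graph(n_nodes)  # arbitrary connected start state
    samples = []
    for t in range(steps):
        G = mc_step(G)
        if t >= steps - n_samples:
            samples.append(G.copy())
    return samples
```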

  13. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test Erdős-Rényi model: graph with N nodes; each edge is present independently with probability P.
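In networkx the ER generator is a one-liner; the parameter values below are illustrative assumptions, not taken from the slides.

```python
import networkx as nx

N, P = 32, 0.2                   # illustrative values
er = nx.erdos_renyi_graph(N, P)  # each of the N*(N-1)/2 possible edges kept with prob. P
print(er.number_of_nodes(), er.number_of_edges())
```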

  14. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test Barabási-Albert model (1 <= M < N): start with M nodes without edges, then add new nodes, each with M new edges (preferential attachment). The resulting graph has M(N - M) edges.
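The corresponding networkx call; N and M below are illustrative, and the assert checks the M(N - M) edge count stated on the slide.

```python
import networkx as nx

N, M = 32, 4                         # illustrative values; requires 1 <= M < N
ba = nx.barabasi_albert_graph(N, M)  # each new node attaches M edges preferentially
assert ba.number_of_edges() == M * (N - M)
```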

  15. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test Watts-Strogatz model: N nodes in a ring, each connected to K/2 neighbors on both sides (K even). Each edge is rewired with probability P; repeated K/2 times.
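The matching networkx generator; the values of N, K, and P are illustrative assumptions.

```python
import networkx as nx

N, K, P = 32, 4, 0.25                  # illustrative values; K should be even
ws = nx.watts_strogatz_graph(N, K, P)  # ring lattice of degree K, edges rewired with prob. P
```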

  16. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test Add a root (In) and a sink (Out) so the DAG has a single input and a single output.
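One simple way to realize step 2 (R -> DAG) together with the In/Out nodes: orient every edge from the lower to the higher node index (which cannot create a cycle), then wire all sources to a new root and all sinks to a new sink. This mirrors the construction used by Xie et al., but the helper below is an illustrative sketch, not their exact code.

```python
import networkx as nx

def to_dag(G: nx.Graph) -> nx.DiGraph:
    """Orient edges low index -> high index, then add 'in' and 'out' nodes."""
    dag = nx.DiGraph()
    dag.add_nodes_from(G.nodes)
    dag.add_edges_from((min(u, v), max(u, v)) for u, v in G.edges)
    sources = [n for n in G.nodes if dag.in_degree(n) == 0]  # no incoming edges
    sinks = [n for n in G.nodes if dag.out_degree(n) == 0]   # no outgoing edges
    dag.add_edges_from(("in", n) for n in sources)           # root feeds all sources
    dag.add_edges_from((n, "out") for n in sinks)            # sink collects all sinks
    assert nx.is_directed_acyclic_graph(dag)
    return dag

dag = to_dag(nx.erdos_renyi_graph(32, 0.2))
```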

  17. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test

  18. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test 1. ReLU + Conv + BN triplet at each node 2. Handle inChannels / outChannels 3. Aggregation operations at joints (see the sketch below)
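A node module in this style could look like the PyTorch sketch below: a learnable weighted sum where several incoming tensors meet (the joint), followed by the ReLU-Conv-BN triplet. The sigmoid-weighted aggregation follows Xie et al.'s description; the class name and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class NodeOp(nn.Module):
    """One DAG node: weighted sum of incoming tensors, then ReLU -> Conv -> BN."""
    def __init__(self, n_inputs: int, in_channels: int, out_channels: int):
        super().__init__()
        self.edge_w = nn.Parameter(torch.zeros(n_inputs))  # one weight per incoming edge
        self.op = nn.Sequential(
            nn.ReLU(),
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
        )

    def forward(self, inputs):           # inputs: list of [B, C_in, H, W] tensors
        w = torch.sigmoid(self.edge_w)   # keep aggregation weights positive
        x = sum(wi * t for wi, t in zip(w, inputs))
        return self.op(x)
```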

  19. IMPLEMENTATION : 1. Random Graph Generation (R) 2. R -> DAG 3. R -> G (Neural Network) 4. Train & Test Experiments on MNIST: 24 MC graphs, 12 ER, 12 BA, 12 WS; same hyperparameters, varying algorithm priors.

  20. RESULTS Accuracy curves: MC graphs 96.25-97.5, ER/BA/WS 95-97. Is MC > the others?

  21. RESULTS DENSITY, DIAMETER, DEGREE CENTRALITY

  22. RESULTS MEAN DEGREE

  23. RESULTS VARIANCE Spread: BA > MC ~ ER > WS

  24. RESULTS

  25. Random Wiring: Verdict? Cons: there are patterns in random-graph properties vs. accuracy; other NAS approaches are better (they use accuracy to evolve); overparameterization. Pros: shatters works claiming that their contribution to the architecture is its wiring; graph properties and the input size predict the "goodness" of an architecture.

  26. FUTURE WORK Bootstrapping: collect good architectures, then sample graphs conditioned on them. Multi-layered Markov chain: decide the architecture, then decide the mapping / mapping distribution. Open questions: dimension compatibility? Slow mixing?

  27. CONCLUSION Importance of wiring; critique; experiments. Random graphs are not equal. Claim: the architecture should depend on the image dimensions (amount of info the network can carry > info in the input).
