
Unveiling the Potential of Randomly Wired Networks in Neural Architecture Search
The emergence of randomly wired networks is challenging traditional approaches to network design. This article examines Neural Architecture Search (NAS) from the perspectives of cognitive science and Markov chain Monte Carlo (MCMC), discussing network evolution through operators as well as slower approaches such as Reinforcement Learning (RL), Recurrent Neural Networks (RNNs), and genetic algorithms. It also analyzes the importance of random wiring and its impact on network performance, drawing on the observation that innate abilities in brain wiring are decoupled from traits learned during a lifetime. Through a survey and an analysis of work such as Xie et al.'s study of randomly wired neural networks, this research aims to provide insights into the future of network optimization.
Presentation Transcript
Randomly Wired Networks are on the rise: have we been creating the wrong networks all along? CSE 598, Mudit Verma
TIMELINE
1) Aim
2) Background & a failed attempt to survey
3) Perspective of cognitive science
4) Analyze Xie et al.'s FAIR article
5) Implementation & decide
6) Results
7) DIY stuff
BACKGROUND: Neural Architecture Search
1. Slow approaches: RL / RNN / genetic algorithms
2. Weight sharing
3. Evolving the network through operators (morphism)
4. Generate & test ~ the opposite of intelligence
AIM: Survey. Look at Neural Architecture Search through the lens of MCMC.
SURVEY: MCMC meets Neural Architecture Search
- MCMC for deep learning (generative models, etc.)
- Random search on NAS (morphisms)
- Obtaining a Bayesian network DAG from data
- Random walks on graphs, sampling from distributions
- Xie et al.: Exploring Randomly Wired Neural Networks for Image Recognition
AIM: Survey. Look at Neural Architecture Search through the lens of MCMC.
- Gain insights about the importance of random wiring
- Analyze Xie et al.; agree/disagree
- Report results
- Write down some ideas
Importance of Random Wiring
- Brain wiring: innate abilities are decoupled from those learned during a lifetime.
- Weight initialization has minimal effect on one-shot NAS.
- Small-world assumption.
Theme: Are the graphs that score high on the accuracy leaderboard random?
IMPLEMENTATION
1. Random graph generation (R)
2. R -> DAG
3. R -> G (neural network)
4. Train & test
IMPLEMENTATION, Step 1: Random graph generation via a Markov chain. Moves: swap edge, stay, add/remove node, add/remove edge. (One chain step is sketched below.)
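The slides list the moves but not the implementation, so here is a minimal sketch of one chain step over undirected graphs, assuming networkx and integer node labels; the move set matches the slide, everything else is illustrative.

```python
import random
import networkx as nx

def mc_step(g: nx.Graph) -> nx.Graph:
    """Apply one randomly chosen move: stay, swap edge, add/remove node, add/remove edge."""
    g = g.copy()
    move = random.choice(["stay", "swap_edge", "toggle_node", "toggle_edge"])
    nodes = list(g.nodes)
    if move == "swap_edge" and g.number_of_edges() > 0:
        # Remove a random existing edge and place a new one elsewhere.
        u, v = random.choice(list(g.edges))
        a, b = random.sample(nodes, 2)
        g.remove_edge(u, v)
        g.add_edge(a, b)
    elif move == "toggle_node":
        if random.random() < 0.5:
            g.add_node(max(nodes) + 1)      # grow: fresh integer label
        elif len(nodes) > 2:
            g.remove_node(random.choice(nodes))
    elif move == "toggle_edge" and len(nodes) >= 2:
        a, b = random.sample(nodes, 2)
        if g.has_edge(a, b):
            g.remove_edge(a, b)
        else:
            g.add_edge(a, b)
    return g  # "stay" leaves the graph unchanged
```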
IMPLEMENTATION, Step 1 (prior): Erdős-Rényi graph with N nodes; each edge is present independently with probability P.
IMPLEMENTATION, Step 1 (prior): Barabási-Albert model (1 <= M < N). Start with M nodes and no edges; each new node is added with M new edges. The resulting graph has M(N - M) edges.
IMPLEMENTATION, Step 1 (prior): Watts-Strogatz model. N nodes in a ring, each connected to K/2 neighbors on both sides (K even). Each edge is rewired with probability P; the process is repeated K/2 times. (All three generators are sketched below.)
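All three priors exist as networkx built-ins, so a minimal sketch needs only the slide's parameters N, P, M, K; the concrete values here are illustrative, not the ones used in the experiments.

```python
import networkx as nx

N = 32
er = nx.erdos_renyi_graph(N, p=0.2)           # each edge present with probability P
ba = nx.barabasi_albert_graph(N, m=5)         # preferential attachment, M edges per new node
ws = nx.watts_strogatz_graph(N, k=4, p=0.25)  # ring with K neighbors, rewired with probability P
```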
IMPLEMENTATION, Step 2: R -> DAG. Add a root (input) node and a sink (output) node.
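A hedged sketch of this step, following the scheme in Xie et al. (assign the nodes an order, orient every edge from lower to higher index, then attach a single input and output node); the function and node names are my own.

```python
import networkx as nx

def to_dag(g: nx.Graph) -> nx.DiGraph:
    order = {v: i for i, v in enumerate(g.nodes)}  # node order doubles as topological order
    dag = nx.DiGraph()
    dag.add_nodes_from(g.nodes)
    dag.add_edges_from((u, v) if order[u] < order[v] else (v, u) for u, v in g.edges)
    # Root feeds every node with no predecessors; sink collects every node with no successors.
    dag.add_edges_from([("in", v) for v in g.nodes if dag.in_degree(v) == 0])
    dag.add_edges_from([(v, "out") for v in g.nodes if dag.out_degree(v) == 0])
    return dag
```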
IMPLEMENTATION, Step 3: R -> G (neural network).
1. ReLU + Conv + BN at each node
2. Handle inChannels / outChannels
3. Operations at the joints (aggregating multiple inputs)
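A minimal PyTorch sketch of the per-node operation, with a learnable positive-weighted sum at the joints as in Xie et al.; the kernel size and channel handling are illustrative assumptions.

```python
import torch
import torch.nn as nn

class NodeOp(nn.Module):
    """One graph node: weighted aggregation of inputs, then ReLU + Conv + BN."""
    def __init__(self, in_channels: int, out_channels: int, n_inputs: int):
        super().__init__()
        # Learnable aggregation weights for the incoming edges ("joints").
        self.edge_w = nn.Parameter(torch.zeros(n_inputs))
        self.op = nn.Sequential(
            nn.ReLU(),
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
        )

    def forward(self, inputs):  # inputs: list of tensors from predecessor nodes
        w = torch.sigmoid(self.edge_w)  # keep the weights positive, in (0, 1)
        x = sum(wi * xi for wi, xi in zip(w, inputs))
        return self.op(x)
```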
IMPLEMENTATION, Step 4: Train & test. Experiments on MNIST: 24 MC graphs, 12 ER, 12 BA, 12 WS. Same hyperparameters; varying algorithm priors.
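A hedged sketch of the experimental protocol: each sampled graph is trained with an identical loop and identical hyperparameters, so only the wiring varies. `build_network` (the R -> DAG -> NodeOp pipeline above) is a hypothetical helper, and the learning rate, batch size, and epoch count are illustrative.

```python
import torch
import torch.nn.functional as F
from torchvision import datasets, transforms

def evaluate(model, loader):
    model.eval()
    correct = 0
    with torch.no_grad():
        for x, y in loader:
            correct += (model(x).argmax(dim=1) == y).sum().item()
    return correct / len(loader.dataset)

def run_trial(graph, epochs=5):
    tfm = transforms.ToTensor()
    train = torch.utils.data.DataLoader(
        datasets.MNIST("data", train=True, download=True, transform=tfm),
        batch_size=128, shuffle=True)
    test = torch.utils.data.DataLoader(
        datasets.MNIST("data", train=False, download=True, transform=tfm),
        batch_size=256)
    model = build_network(graph)  # hypothetical helper: graph -> nn.Module
    opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    for _ in range(epochs):       # identical hyperparameters for every trial
        model.train()
        for x, y in train:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
    return evaluate(model, test)
```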
RESULTS: Accuracy curves. MC graphs: 96.25-97.5%; ER/BA/WS: 95-97%. MC > others?
RESULTS: Variance. Spread: BA > MC ~ ER > WS.
Random Wiring: Verdict?
Cons:
- Patterns appear between random-graph properties and accuracy
- Other NAS methods are better (they use accuracy to evolve)
- Overparameterization
Pros:
- Shatters works claiming to contribute to architecture merely through wiring
- Graph properties and the input size predict the "goodness" of an architecture
FUTURE WORK
- Bootstrapping: collect good architectures, then sample graphs conditioned on them.
- Multi-layered Markov chain: first decide the architecture, then the mapping / mapping distribution.
- Open questions: dimension compatibility? Slow mixing?
CONCLUSION
- Importance of wiring: critique and experiments.
- Random graph generators are not equal.
- Claim: the architecture should depend on the image dimensions (the amount of information matters more than the information itself).