# Neural Architecture Search
Why not learn the optimal neural architecture?
But how do we backprop through discrete, non-differentiable variables?
- Architectures/graphs, number of layers, number of nodes, connectivity
- Several workarounds circumvent the problem; one of them, a continuous relaxation of the discrete choice, is sketched below
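One popular workaround, used by DARTS (listed below), is to relax the discrete choice among candidate operations into a softmax-weighted mixture, which is differentiable with respect to the architecture parameters. A minimal PyTorch sketch for a single edge; the candidate op set here is illustrative, not the paper's exact set:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """DARTS-style continuous relaxation of one discrete op choice.

    Instead of selecting a single op (non-differentiable), the forward
    pass returns a softmax-weighted sum over all candidates, so the
    architecture parameters `alpha` receive gradients like any weight.
    """
    def __init__(self, channels):
        super().__init__()
        # Candidate operations on one edge (illustrative set).
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.Identity(),
        ])
        # One architecture parameter per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```

During search, the `alpha`s are optimized by gradient descent alongside the regular weights (DARTS formulates this as a bilevel problem); afterwards the edge is discretized by keeping only the op with the largest `alpha`.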
Should you run your own neural architecture search? If you are Facebook or Google, yes! For everyone else, the compute cost of evaluating many candidate architectures is usually prohibitive.
## Evolutionary search for NAS
Some papers:
- DARTS: Differentiable Architecture Search, Liu et al., 2018
- Efficient Neural Architecture Search via Parameter Sharing, Pham et al., 2018
- Evolving Space-Time Neural Architectures for Videos, Piergiovanni et al., 2018
- Regularized Evolution for Image Classifier Architecture Search, Real et al., 2019 (sketched below)
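As a concrete illustration of the last entry, here is a toy sketch of regularized ("aging") evolution in the spirit of Real et al., 2019. The callbacks `evaluate`, `random_arch`, and `mutate` are placeholders for a problem-specific encoding, trainer, and mutation operator:

```python
import collections
import random

def regularized_evolution(evaluate, random_arch, mutate,
                          population_size=50, sample_size=10, cycles=500):
    """Toy aging-evolution loop in the spirit of Real et al., 2019.

    evaluate(arch) -> validation score, random_arch() -> random encoding,
    mutate(arch)   -> mutated copy of an encoding (all problem-specific).
    """
    population = collections.deque()  # ordered by age: oldest on the left
    history = []

    # Seed the population with random architectures.
    while len(population) < population_size:
        arch = random_arch()
        population.append((arch, evaluate(arch)))
    history.extend(population)

    for _ in range(cycles):
        # Tournament selection: best of a random sample becomes the parent.
        tournament = random.sample(list(population), sample_size)
        parent_arch, _ = max(tournament, key=lambda pair: pair[1])
        child = mutate(parent_arch)
        child_acc = evaluate(child)
        population.append((child, child_acc))
        # Aging: drop the *oldest* member, not the worst one -- this is
        # the "regularization" that keeps the search exploring.
        population.popleft()
        history.append((child, child_acc))

    return max(history, key=lambda pair: pair[1])
```

With architectures encoded as, say, fixed-length tuples of op indices, `mutate` resamples one position; removing the oldest rather than the worst individual prevents early lucky models from dominating the population forever.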
## Combinatorial Bayesian Optimization for NAS
Combinatorial Bayesian Optimization using the Graph Cartesian Product, C. Oh, J. Tomczak, E. Gavves, M. Welling, NeurIPS 2019.
- Treat neural architecture search as hyperparameter optimization
- Learn a Bayesian model of the architecture space and sample likely good architectures; a generic BO loop is sketched below
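COMBO's actual surrogate is a Gaussian process whose kernel is built on the graph Cartesian product of the per-variable search graphs; that construction is too involved for a short sketch, so the toy below swaps in a simple Hamming-distance kernel over discrete encodings just to show the generic loop: fit a GP to the architectures evaluated so far, then pick the next one by expected improvement. The kernel and all names here are illustrative assumptions, not the paper's method:

```python
import numpy as np
from scipy.stats import norm

def hamming_kernel(A, B, beta=1.0):
    """Similarity between discrete encodings that decays with the number
    of differing choices (a stand-in for COMBO's graph-based kernel)."""
    dists = (A[:, None, :] != B[None, :, :]).sum(-1)   # pairwise Hamming
    return np.exp(-beta * dists)

def bo_step(X_seen, y_seen, candidates, noise=1e-3):
    """One BO iteration: GP posterior over discrete encodings, then
    expected improvement to choose the next architecture to train."""
    K = hamming_kernel(X_seen, X_seen) + noise * np.eye(len(X_seen))
    K_inv = np.linalg.inv(K)
    k_star = hamming_kernel(candidates, X_seen)        # (n_cand, n_seen)
    mu = k_star @ K_inv @ y_seen                       # posterior mean
    var = 1.0 - np.einsum('ij,jk,ik->i', k_star, K_inv, k_star)
    sigma = np.sqrt(np.maximum(var, 1e-9))             # posterior std
    best = y_seen.max()
    z = (mu - best) / sigma
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    return candidates[np.argmax(ei)]                   # next arch to evaluate
```

Each `bo_step` call returns the most promising untried encoding; training it, appending the result to `X_seen`/`y_seen`, and repeating gives the sample-efficient search loop the paper targets.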