Neural Architecture Search (NAS)
Automated methods for discovering high-performing neural network architectures, using search algorithms to explore the space of possible designs rather than relying on manual trial and error.
How It Works
NAS defines a search space of possible architectures (layer types, connections, sizes), a search strategy (reinforcement learning, evolutionary algorithms, gradient-based methods), and an evaluation method for candidate architectures.
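The three components above can be sketched with the simplest search strategy, random search. This is an illustrative toy, not a real NAS system: the search space, the sampling helper, and especially the evaluation function are hypothetical placeholders (a real evaluator trains each candidate network and returns its validation accuracy, which is what makes NAS expensive).

```python
import random

# Search space: allowed choices for depth, width, and layer type.
# These particular options are illustrative, not from any real system.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [64, 128, 256],
    "layer": ["conv3x3", "conv5x5", "depthwise"],
}

def sample_architecture(rng):
    """Search strategy (random search): pick one option per dimension."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Evaluation method -- placeholder. In real NAS this step trains the
    candidate network and measures validation accuracy; here we return a
    synthetic score so the example runs instantly."""
    score = arch["depth"] * 0.05 + arch["width"] / 1000
    if arch["layer"] == "depthwise":
        score += 0.1  # pretend depthwise convolutions help on this task
    return score

def random_search(n_trials=20, seed=0):
    """Sample n_trials candidates and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 3))
```

Swapping in a smarter search strategy (reinforcement learning, evolution, or gradient-based relaxation) means replacing `sample_architecture` with something that uses past evaluation results to propose better candidates.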
Results
NAS has discovered architectures that match or exceed human-designed networks: EfficientNet, NASNet, and AmoebaNet were all found through automated search. However, the search itself can be extremely expensive; early reinforcement-learning-based searches consumed thousands of GPU-days evaluating candidates.
Practical Impact
Most practitioners use established architectures (ResNet, Transformer) rather than running NAS themselves. Its primary impact has been in discovering efficient mobile architectures and in mapping the design space of neural networks.