
What Is Neural Architecture Search (NAS)?

Definition

Neural Architecture Search (NAS) is an automated machine learning technique that uses algorithms to explore and discover optimal neural network architectures for a given task, replacing the manual trial-and-error process of designing model structures.

How Neural Architecture Search (NAS) Works

Designing neural network architectures traditionally required extensive expertise and experimentation. NAS automates this by defining a search space of possible architectures (layer types, connections, hyperparameters) and using optimization algorithms — including reinforcement learning, evolutionary algorithms, or gradient-based methods — to find architectures that perform best on a target task. NAS has discovered architectures such as EfficientNet and NASNet that outperformed human-designed networks. However, early NAS was extremely expensive, often requiring thousands of GPU-days of compute. Modern efficient methods like DARTS reduce this cost dramatically by relaxing the discrete search space into a continuous one that can be optimized with gradient descent. NAS has been influential in both computer vision and NLP, and Google has used NAS to help design components of its production models.
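The core loop described above — define a search space, sample candidate architectures, evaluate them, and keep the best — can be sketched with the simplest NAS strategy, random search. This is a minimal illustration with a made-up toy search space and a stand-in scoring function; a real NAS run would train each candidate network and score it on validation accuracy.

```python
import random

# Hypothetical toy search space: each candidate architecture is a choice
# of depth, width, and activation for a small feed-forward network.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "hidden_units": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Randomly pick one option per dimension of the search space."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training + validation. A real NAS loop would train
    the candidate network and return its validation accuracy; this toy
    proxy just rewards moderate depth and width (purely illustrative)."""
    return -abs(arch["num_layers"] - 4) - abs(arch["hidden_units"] - 64) / 32

def random_search(n_trials=20, seed=0):
    """Sample n_trials candidates and return the best one found."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, score)
```

Reinforcement-learning and evolutionary NAS replace the random sampler with a learned or mutated proposal distribution, and gradient-based methods like DARTS avoid per-candidate training entirely, but the outer structure — propose, evaluate, select — stays the same.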

Real-World Examples

1. Google using NAS to discover the EfficientNet architecture, which achieved better accuracy with fewer parameters than human-designed CNNs

2. A research lab running NAS to find the optimal Transformer variant for their specific language task

3. A company using NAS on a budget to discover a compact model architecture optimized for deployment on mobile devices
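The mobile-deployment scenario in the third example typically adds a resource constraint to the search: candidates whose estimated size exceeds a budget are rejected before any training happens. A minimal sketch, assuming a simple MLP candidate family and a made-up parameter budget (the dimensions and budget here are illustrative, not from any real system):

```python
import random

# Hypothetical mobile-oriented search: estimate the parameter count of a
# simple MLP candidate and reject architectures over a size budget.
INPUT_DIM, OUTPUT_DIM = 784, 10

def param_count(num_layers, hidden_units):
    """Weights + biases for an MLP with uniform hidden width."""
    dims = [INPUT_DIM] + [hidden_units] * num_layers + [OUTPUT_DIM]
    return sum(a * b + b for a, b in zip(dims[:-1], dims[1:]))

def sample_within_budget(budget, rng, max_tries=100):
    """Rejection-sample candidates until one fits under the budget."""
    for _ in range(max_tries):
        arch = {
            "num_layers": rng.choice([2, 4, 6]),
            "hidden_units": rng.choice([32, 64, 128]),
        }
        if param_count(arch["num_layers"], arch["hidden_units"]) <= budget:
            return arch
    return None  # no candidate fit the budget

rng = random.Random(0)
arch = sample_within_budget(budget=100_000, rng=rng)
print(arch)
```

Production hardware-aware NAS systems go further, folding measured latency or energy on the target device directly into the search objective rather than using a simple hard cutoff.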
