Neural Architecture Search (NAS) is an emerging area focused on automating the design of neural network architectures. The overarching goal of NAS research is to discover network architectures that achieve high performance on specific tasks, such as image recognition or natural language processing, without requiring extensive human expertise. Researchers aim to develop algorithms that can efficiently explore the vast space of possible architectures, identify the most promising candidates, and refine them to improve performance. The current landscape of NAS encompasses a variety of approaches, including reinforcement learning, evolutionary algorithms, and gradient-based methods. Ongoing developments aim to reduce the computational cost of the search process, improve the generalizability of discovered architectures, and extend NAS to emerging settings such as edge computing and neural network compression.
This Collection presents original research highlighting efforts to create more efficient and scalable NAS algorithms that can adapt to different tasks and hardware constraints.