Publication | Closed Access
Hardware/Software Co-Exploration of Neural Architectures
Citations: 144 | References: 40 | Year: 2020
Keywords: Artificial Intelligence, Engineering, Machine Learning, Computer Architecture, System-level Design, Hardware Systems, High-performance Architecture, Search Space, Computing Systems, Neurocomputers, Technology Co-optimization, Prune Inferior Architectures, Computer Engineering, Hardware Optimization, Computer Science, Neural Architecture Search, Architecture Search Space, Neural Architectures, Hardware Acceleration, Computational Neuroscience, Domain-specific Accelerator, Brain-like Computing
We propose a novel hardware and software co-exploration framework for efficient neural architecture search (NAS). Different from existing hardware-aware NAS, which assumes a fixed hardware design and explores only the NAS space, our framework simultaneously explores both the architecture search space and the hardware design space to identify the best neural architecture and hardware pairs that maximize both test accuracy and hardware efficiency. Such a practice greatly opens up the design freedom and pushes forward the Pareto frontier between hardware efficiency and test accuracy for better design tradeoffs. The framework iteratively performs a two-level (fast and slow) exploration. Without lengthy training, the fast exploration can effectively fine-tune hyperparameters and prune architectures that are inferior in terms of hardware specifications, which significantly accelerates the NAS process. Then, the slow exploration trains candidates on a validation set and updates a controller using reinforcement learning to maximize the expected accuracy together with the hardware efficiency. In this article, we demonstrate that the co-exploration framework can effectively expand the search space to incorporate models with high accuracy, and we theoretically show that the proposed two-level optimization can efficiently prune inferior solutions to better explore the search space. The experimental results on ImageNet show that the co-exploration NAS can find solutions with the same accuracy but 35.24% higher throughput and 54.05% higher energy efficiency, compared with hardware-aware NAS.
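The two-level loop in the abstract can be illustrated with a minimal sketch: the fast level prunes architecture/hardware pairs that violate hardware specifications without any training, and the slow level scores the survivors with a scalar reward that combines validation accuracy and hardware efficiency for the RL controller. All names below (`Candidate`, `fast_explore`, `reward`, the weighted-sum form of the reward) are illustrative assumptions, not the authors' actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    arch_id: int        # index into the architecture search space
    hw_id: int          # index into the hardware design space
    latency_ms: float   # estimated from a hardware model, no training needed
    energy_mj: float    # estimated energy per inference

def fast_explore(candidates, max_latency_ms, max_energy_mj):
    """Fast level: drop pairs that violate the hardware specifications.

    No training is involved, so this pruning is cheap and can be run
    over a large pool of architecture/hardware pairs.
    """
    return [c for c in candidates
            if c.latency_ms <= max_latency_ms and c.energy_mj <= max_energy_mj]

def reward(accuracy, energy_mj, beta=0.5):
    """Slow level: scalar reward for the RL controller.

    A weighted sum of validation accuracy and energy efficiency is one
    common formulation (assumption); the paper's exact reward may differ.
    """
    efficiency = 1.0 / energy_mj
    return accuracy + beta * efficiency

# Usage with hypothetical candidates:
pool = [Candidate(0, 0, 5.0, 2.0),
        Candidate(1, 0, 12.0, 1.5),   # too slow: pruned by the fast level
        Candidate(2, 1, 4.0, 0.8)]
feasible = fast_explore(pool, max_latency_ms=10.0, max_energy_mj=2.5)
```

Only the feasible pairs would then be trained on the validation set and scored with `reward`, so the expensive slow level never touches architectures that cannot meet the hardware specs.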