Publication | Closed Access
Energy Efficient Neural Computing: A Study of Cross-Layer Approximations
45 Citations · 34 References · Year 2018
Artificial Intelligence · Convolutional Neural Network · Deep Neural Networks · Engineering · Machine Learning · Data Science · Energy Efficiency · Computational Neuroscience · Sparse Neural Network · Computer Engineering · Embedded Machine Learning · Computer Science · Neuromorphic Engineering · Brain-like Computing · Deep Learning · Neural Architecture Search · Neurocomputers · Cross-layer Approximations
Deep neural networks (DNNs) have emerged as the state-of-the-art technique for a wide range of machine learning tasks in analytics and computer vision on the next generation of embedded (mobile, IoT, and wearable) devices. Despite their success, they suffer from high energy requirements. In recent years, the inherent error resiliency of DNNs has been exploited by introducing approximations at either the algorithm or the hardware level (individually) to obtain energy savings while incurring tolerable accuracy degradation. However, there is a need to investigate the overall energy-accuracy trade-offs that arise when approximations are introduced at different levels in complex DNNs. We perform a comprehensive analysis to determine the effectiveness of cross-layer approximations for the energy-efficient realization of large-scale DNNs. The approximations considered are as follows: 1) use of lower-complexity networks (with fewer layers and/or neurons per layer); 2) pruning of synaptic weights; 3) approximate multiplication in the neuronal multiply-and-accumulate computation; and 4) approximate write/read operations to/from the synaptic memory. Our experiments on recognition benchmarks (MNIST and CIFAR10) show that cross-layer approximation provides substantial improvements in energy efficiency across different accuracy/quality requirements. Furthermore, we propose a synergistic framework for combining the approximation techniques to achieve maximal energy benefits from approximate DNNs.
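Two of the approximations named in the abstract, synaptic weight pruning and approximate multiplication, can be illustrated with a minimal sketch. The functions below are hypothetical and not from the paper: `prune_weights` applies magnitude-based pruning (zeroing weights below a threshold), and `approx_multiply` mimics a low-precision multiplier by truncating operands to a coarse fixed-point grid before multiplying. The paper's actual pruning criteria and approximate-multiplier design may differ.

```python
import numpy as np

def prune_weights(weights, threshold):
    """Magnitude-based pruning sketch: zero out synaptic weights whose
    absolute value falls below the threshold; returns the pruned
    weights and the binary keep-mask."""
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

def approx_multiply(a, b, frac_bits=4):
    """Approximate multiply sketch: truncate both operands to a
    fixed-point grid with `frac_bits` fractional bits, emulating a
    reduced-precision hardware multiplier."""
    scale = 2 ** frac_bits
    aq = np.floor(a * scale) / scale
    bq = np.floor(b * scale) / scale
    return aq * bq

# Example: prune a small random weight matrix and measure sparsity.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=(4, 4))
pruned, mask = prune_weights(w, threshold=0.5)
sparsity = 1.0 - mask.mean()  # fraction of weights zeroed out
```

In a cross-layer setting, the energy saving comes from skipping MAC operations on the zeroed weights and from using the cheaper truncated multiplier for the remaining ones, at the cost of a small accuracy loss that the threshold and bit-width trade off.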