Publication | Open Access
Using Floating-Gate Memory to Train Ideal Accuracy Neural Networks
Citations: 43 | References: 15 | Year: 2019
Keywords: Engineering, Machine Learning, Neural Networks (Machine Learning), Energy Efficiency, Computer Architecture, Floating-gate Memory, Integrated Circuits, Neurochip, Social Sciences, Computing Systems, Neuromorphic Engineering, Neurocomputers, Electrical Engineering, Computer Engineering, Computer Science, Neural Networks (Computational Neuroscience), Neural Networks, Floating-gate Silicon-oxide-nitride-oxide-silicon, Neural Architecture Search, Microelectronics, Hardware Acceleration, Brain-like Computing
Floating-gate silicon-oxide-nitride-oxide-silicon (SONOS) transistors can be used to train neural networks to ideal accuracies that match those of floating-point digital weights on the MNIST handwritten-digit data set: accuracy matches the floating-point baseline when multiple devices represent each weight, and comes within 1% of it when a single device is used. This is enabled by operating the devices in the subthreshold regime, where their write nonlinearities are symmetric. A SONOS-based neural training accelerator core using a single device per weight would be 120× more energy efficient, 2.1× faster, and 5× smaller in area than an optimized SRAM-based ASIC.
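The role of symmetric write nonlinearity in the single-device result can be illustrated with a toy simulation. Below is a minimal sketch, assuming a conductance model in which the effective write step shrinks as the device nears its rails by the same factor for potentiation and depression; the constants (`GMAX`, `LR`) and the small logistic-regression task are illustrative stand-ins, not the paper's measured SONOS characteristics or its MNIST setup.

```python
import numpy as np

rng = np.random.default_rng(0)
GMAX, LR = 1.0, 0.1  # normalized conductance limit and learning rate (toy values)

def nonlinear_update(g, delta):
    """Apply a requested weight change through a symmetric write nonlinearity.

    The effective step shrinks as |g| approaches GMAX, identically for
    increases and decreases, so the small +/- updates of stochastic gradient
    descent remain roughly zero-mean instead of biasing the weight.
    """
    return np.clip(g + delta * (1.0 - np.abs(g) / GMAX), -GMAX, GMAX)

# Toy linearly separable task: logistic regression with device-stored weights.
X = rng.normal(size=(512, 8))
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(float)

w = np.zeros(8)  # one "device" conductance per weight
for epoch in range(20):
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + np.exp(-xi @ w))   # sigmoid activation
        grad = (p - yi) * xi                # logistic-loss gradient
        w = nonlinear_update(w, -LR * grad) # write through the device model

acc = np.mean(((X @ w) > 0) == y)
print(f"training accuracy with nonlinear device updates: {acc:.3f}")
```

Because the distortion acts identically on positive and negative updates, gradient noise largely cancels over training; the paper's multiple-devices-per-weight configuration could be approximated in the same framework by summing several such devices with different significances, though that scheme is not reproduced here.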