Publication | Closed Access
Backpropagation through nonlinear units for the all-optical training of neural networks
Citations: 85
References: 36
Year: 2021
Keywords: Geometric Learning, Convolutional Neural Network, Engineering, Machine Learning, All-optical Training, Recurrent Neural Network, Optical Computing, Sparse Neural Network, Optical Systems, Nonlinear Units, Saturable Absorption, Non-linear Optics, Nonlinear Signal Processing, Computer Science, Neural Networks, Deep Learning, Neural Architecture Search, Practical Scheme, Computational Neuroscience, Neuronal Network
We propose a practical scheme for end-to-end optical backpropagation in neural networks. Using saturable absorption for the nonlinear units, we find that the backward-propagating gradients required to train the network can be approximated in a surprisingly simple pump-probe scheme that requires only passive optical elements. Simulations show that, with readily obtainable optical depths, our approach can achieve performance equivalent to state-of-the-art computational networks on image classification benchmarks, even in deep networks with multiple sequential gradient approximations. With backpropagation through nonlinear units being an outstanding challenge in the field, this work provides a feasible path toward truly all-optical neural networks.
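The core idea of the abstract can be illustrated numerically. The sketch below is not the paper's exact scheme: it assumes a common toy model of a saturable absorber (transmission that increases with input intensity, parameterized by a hypothetical optical depth `od` and saturation intensity `i_sat`) and mimics a pump-probe gradient measurement as a small symmetric perturbation around the operating point, i.e. a finite-difference estimate of the nonlinearity's local slope.

```python
import numpy as np

def sat_abs(x, od=1.0, i_sat=1.0):
    # Toy saturable-absorber activation: amplitude transmission whose
    # absorption saturates as the input intensity grows.
    # (Illustrative model only; parameters od and i_sat are assumptions,
    # not values from the paper.)
    intensity = x ** 2
    transmission = np.exp(-od / (2.0 * (1.0 + intensity / i_sat)))
    return x * transmission

def pump_probe_grad(x, eps=1e-3, **kw):
    # Mimic a pump-probe measurement of the local slope: a weak probe
    # displaces the pump by +/- eps and the response difference
    # approximates d(sat_abs)/dx at the operating point x.
    return (sat_abs(x + eps, **kw) - sat_abs(x - eps, **kw)) / (2.0 * eps)
```

In a training loop, `pump_probe_grad` would stand in for the analytic derivative of the activation during the backward pass, which is the role the optical pump-probe scheme plays in the proposed all-optical network.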