Publication | Closed Access
Competitive learning with floating-gate circuits
Citations: 378
References: 19
Year: 2002
Cluster Computing · Engineering · Machine Learning · 11-Transistor Silicon Circuit · Algorithmic Learning · Unsupervised Machine Learning · Competitive Learning · Data Mining · Sparse Neural Network · Unconventional Computing · Floating-gate Circuits · Bump Circuit · Computational Learning Theory · Knowledge Discovery · Computer Engineering · Computer Science · Neural Architecture Search · Circuit Design · Brain-like Computing
Competitive learning is a general technique for training clustering and classification networks. We have developed an 11-transistor silicon circuit, which we term an automaximizing bump circuit, that uses silicon physics to naturally implement a similarity computation, local adaptation, simultaneous adaptation and computation, and nonvolatile storage. This circuit is an ideal building block for constructing competitive-learning networks. We illustrate the adaptive nature of the automaximizing bump circuit in two ways. First, we demonstrate a silicon competitive-learning circuit that clusters one-dimensional (1-D) data. We then illustrate a general architecture based on the automaximizing bump circuit; we show the effectiveness of this architecture, via software simulation, on a general clustering task. We corroborate our analysis with experimental data from circuits fabricated in a 0.35-µm CMOS process.
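For readers unfamiliar with the algorithm the circuit implements, the following is a minimal software sketch of winner-take-all competitive learning on 1-D data. It is only an illustrative analogue of what the paper realizes in analog silicon: the nearest-prototype search stands in for the circuit's similarity computation, and the winner-only update stands in for its local adaptation. All function and variable names here are invented for illustration, not taken from the paper.

```python
import random

def competitive_learning_1d(data, n_units=2, lr=0.1, epochs=50, seed=0):
    """Winner-take-all competitive learning on a list of 1-D samples.

    Each unit stores one prototype value. For every sample, the closest
    prototype "wins" and moves a fraction `lr` toward the sample; the
    other units are left unchanged (local adaptation).
    """
    rng = random.Random(seed)
    # Initialize prototypes to distinct samples from the data.
    protos = rng.sample(data, n_units)
    for _ in range(epochs):
        for x in rng.sample(data, len(data)):  # shuffled pass over the data
            # Similarity computation: find the nearest prototype.
            winner = min(range(n_units), key=lambda i: abs(protos[i] - x))
            # Only the winner adapts, moving toward the input.
            protos[winner] += lr * (x - protos[winner])
    return sorted(protos)

# Two well-separated 1-D clusters, around 0.0 and around 5.0.
data = [0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05]
centers = competitive_learning_1d(data)
```

With this data the two prototypes settle near the two cluster centers; in the paper the analogous adaptation happens continuously in the floating-gate charge rather than in discrete software updates.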