Publication | Closed Access
Dimension Reduction by Local Principal Component Analysis
719 Citations | 35 References | Year: 1997
Topics: Nonlinear Extensions · Nonlinear PCA · Engineering · Machine Learning · Data Science · Pattern Recognition · Dimension Reduction · Knowledge Discovery · Multilinear Subspace Learning · Speech Processing · Computer Science · Independent Component Analysis · Dimensionality Reduction · Medical Image Computing · Principal Component Analysis · Nonlinear Dimensionality Reduction · Speech Recognition
Reducing or eliminating statistical redundancy between the components of high-dimensional vector data enables a lower-dimensional representation without significant loss of information. Recognizing the limitations of principal component analysis (PCA), researchers in the statistics and neural network communities have developed nonlinear extensions of PCA. This article develops a local linear approach to dimension reduction that provides accurate representations and is fast to compute. We exercise the algorithms on speech and image data, and compare performance with PCA and with neural network implementations of nonlinear PCA. We find that both nonlinear techniques can provide more accurate representations than PCA and show that the local linear techniques outperform neural network implementations.
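The local linear approach the abstract describes can be illustrated with a short sketch: partition the data into regions, then fit an ordinary PCA subspace within each region, so that a curved manifold is approximated piecewise by flat patches. The sketch below is an assumption-laden illustration of that general idea, not the paper's exact algorithm; it uses plain k-means for the partition (the paper also considers other partitioning criteria) and compares per-sample reconstruction error against a single global PCA on toy data lying near a curved one-dimensional manifold.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_basis(X, d):
    """Mean and top-d principal directions of X (rows = samples)."""
    mu = X.mean(axis=0)
    # Right singular vectors of the centered data are the eigenvectors
    # of the sample covariance, ordered by variance.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:d]

def reconstruct(X, mu, V):
    """Project onto the d-dimensional subspace and map back."""
    return mu + (X - mu) @ V.T @ V

# Toy data: noisy samples near a parabola in 2-D, which a single
# global 1-D PCA subspace fits poorly.
t = rng.uniform(-1, 1, size=500)
X = np.column_stack([t, t**2]) + 0.01 * rng.normal(size=(500, 2))

# Global PCA to d = 1.
mu, V = pca_basis(X, 1)
err_global = np.mean(np.sum((X - reconstruct(X, mu, V)) ** 2, axis=1))

# Local PCA: a few k-means iterations, then one PCA per cell.
k, d = 4, 1
centers = X[rng.choice(len(X), k, replace=False)]
for _ in range(20):
    labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([X[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])

err_local = 0.0
for j in range(k):
    Xj = X[labels == j]
    mu_j, V_j = pca_basis(Xj, d)
    err_local += np.sum((Xj - reconstruct(Xj, mu_j, V_j)) ** 2)
err_local /= len(X)

print(err_local < err_global)  # the piecewise-linear fit should win
```

Encoding a point costs one extra index (which region it fell in) plus its d local coordinates; the payoff, as the abstract reports, is a more accurate low-dimensional representation than a single global PCA when the data lie near a curved manifold.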
| Year | Citations |
|---|---|
| 1989 | 20.7K |
| 1987 | 10K |
| 1989 | 9.3K |
| 1988 | 5.4K |
| 1989 | 4.2K |
| 1991 | 3.6K |
| 1985 | 3.4K |
| 1991 | 3K |
| 1984 | 2.4K |
| 1982 | 2.3K |