Publication | Closed Access
Modality-Constrained Statistical Learning of Tactile, Visual, and Auditory Sequences.
538 Citations · 52 References · Year: 2005
Keywords: Psychoacoustics, Auditory Imagery, Machine Learning, Neurolinguistics, Haptic Technology, Psycholinguistics, Cognition, Phonology, Social Sciences, Phonetics, Language Studies, Sensory Modalities, Psychophysics, Multisensory Integration, Auditory Processing, Cognitive Science, Modality Constraints, Auditory Modeling, Auditory Modality, Computational Neuroscience, Speech Processing, Speech Perception, Linguistics, Modality-Constrained Statistical Learning
The authors investigated the extent to which touch, vision, and audition mediate the processing of statistical regularities within sequential input. Few researchers have conducted rigorous comparisons across sensory modalities; in particular, the sense of touch has been virtually ignored. The current data reveal not only commonalities but also modality constraints affecting statistical learning across the senses. Specifically, the auditory modality displayed a quantitative learning advantage compared with vision and touch. In addition, the authors discovered qualitative learning biases among the senses: primarily, audition afforded better learning for the final part of input sequences. These findings are discussed in terms of whether statistical learning is likely to consist of a single, unitary mechanism or multiple, modality-constrained ones.