Publication | Closed Access
Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses
Citations: 495 | References: 7 | Year: 2002
Topics: Engineering, Cognition, Communication, Attention, Sensory Science, Sensory Systems, Social Sciences, Sensory Integration, Sensory Perception, Sensometrics, Cognitive Neuroscience, Psychophysics, Multisensory Integration, Perception System, Multimodal Perception, Cognitive Science, Machine Vision, Texture Gradients, Vision Research, Computer Vision, Visual Function, Eye Tracking, Neuroscience, Multiple Sources, Stereoscopic Processing, Sensory Information
Humans integrate multiple sensory cues, such as visual disparity and haptic texture, to estimate object properties, but combining cues can reduce the distinct information each cue provides. We found that combining cues within a single modality (e.g., visual disparity and texture) reduces single‑cue information, whereas combining across modalities (vision and haptics) preserves it.
Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using binocular disparity, perspective projection, etc. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve estimation of object properties but may come at a cost: loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
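The cue combination described above is usually modeled as reliability-weighted averaging: each cue's estimate is weighted by its inverse variance, which yields a fused estimate more reliable than either cue alone. A minimal sketch of that standard model (an illustration of the general framework in this literature, not code or data from the paper itself; the example slant values and variances are hypothetical):

```python
def combine_cues(estimates, variances):
    """Return the inverse-variance-weighted combined estimate and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * s for w, s in zip(weights, estimates)) / total
    # The fused estimate's variance is lower than any single cue's variance.
    combined_variance = 1.0 / total
    return combined, combined_variance

# Hypothetical example: a visual disparity cue (slant 30 deg, variance 4)
# and a haptic cue (slant 36 deg, variance 12). The more reliable visual
# cue dominates the combined estimate.
slant, var = combine_cues([30.0, 36.0], [4.0, 12.0])
```

Here the combined slant is 31.5 deg with variance 3.0, closer to the low-variance visual cue; the loss (or preservation) of the single-cue estimates after such fusion is what the paper tests within and between modalities.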