Concepedia

TLDR

Humans integrate multiple sensory cues, such as visual disparity and haptic texture, to estimate object properties, but combining cues can reduce the distinct information each cue provides. We found that combining cues within a single modality (e.g., visual disparity and texture) reduces single‑cue information, whereas combining across modalities (vision and haptics) preserves it.

Abstract

Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using binocular disparity, perspective projection, etc. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve estimation of object properties but may come at a cost: loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
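The improvement from combining cues is commonly formalized in this literature as maximum-likelihood integration, in which each cue's estimate is weighted by its reliability (inverse variance). This model is an assumption here, not a method stated in the abstract, and the numbers below are purely illustrative:

```python
# Reliability-weighted (maximum-likelihood) cue combination.
# Each cue provides an estimate with some variance; the combined
# estimate weights each cue by its inverse variance, and the
# combined variance is lower than that of any single cue.

def combine_cues(estimates, variances):
    """Return the ML-combined estimate and its variance."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    combined = sum(w * e for w, e in zip(weights, estimates))
    combined_variance = 1.0 / total
    return combined, combined_variance

# Hypothetical values: a disparity cue and a texture cue to surface slant.
slant, var = combine_cues([30.0, 36.0], [4.0, 8.0])
print(slant, var)  # combined estimate lies nearer the more reliable cue
```

The combined variance (here 1/(1/4 + 1/8) ≈ 2.67) is smaller than either single-cue variance, which is the quantitative sense in which combination "can improve estimation of object properties".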
