Publication | Open Access
Learning to Feel Textures: Predicting Perceptual Similarities From Unconstrained Finger-Surface Interactions
Citations: 26
References: 38
Year: 2022
Keywords: Geometric Learning, Haptic Feedback, Perceptual Similarities, Engineering, Machine Learning, Finger-surface Interactions, Biometrics, Haptic Technology, Motor Control, Social Sciences, Kinesiology, Data Science, Touch User Interface, Pattern Recognition, Affective Computing, Perception System, Multimodal Human Computer Interface, Cognitive Science, Distinct Tactile Properties, Deep Learning, Surface Similarity, Gesture Recognition, Computer Vision, Contact Force, Feel Textures, Texture (Visual Arts)
Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.
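The core pipeline described in the abstract — slice each interaction signal into short time windows, compute features per window, form probability distributions of those features, and compare distributions across surfaces — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the RMS-amplitude feature, the Jensen-Shannon distance, the window length, and the synthetic "recordings" are all assumptions standing in for the paper's actual features, metric, and data.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import spearmanr

def window_rms_histogram(signal, window=100, bins=20, feat_range=(0.0, 3.0)):
    # Slice the signal into short, non-overlapping windows and compute one
    # scalar feature per window (here: RMS amplitude, a stand-in feature).
    n = len(signal) // window
    feats = np.array([
        np.sqrt(np.mean(signal[i * window:(i + 1) * window] ** 2))
        for i in range(n)
    ])
    # Normalize the histogram into a probability distribution over feature values.
    hist, _ = np.histogram(feats, bins=bins, range=feat_range)
    return hist / hist.sum()

def predicted_dissimilarity(sig_a, sig_b):
    # Distance between the two surfaces' feature distributions serves as the
    # model's predicted perceptual dissimilarity (Jensen-Shannon is assumed).
    return jensenshannon(window_rms_histogram(sig_a), window_rms_histogram(sig_b))

# Toy evaluation against hypothetical human judgments.
rng = np.random.default_rng(0)
# Fake "recordings" from four surfaces: noise amplitude stands in for roughness.
surfaces = {name: amp * rng.standard_normal(5000)
            for name, amp in [("s1", 0.4), ("s2", 0.6), ("s3", 1.0), ("s4", 1.4)]}

pairs = [("s1", "s2"), ("s1", "s3"), ("s1", "s4"),
         ("s2", "s3"), ("s2", "s4"), ("s3", "s4")]
model = [predicted_dissimilarity(surfaces[a], surfaces[b]) for a, b in pairs]
# Hypothetical participant ratings (1 = very similar, 5 = very different).
human = [1, 3, 5, 2, 4, 2]
rho, _ = spearmanr(model, human)  # rank correlation, as reported in the paper
```

In the paper's setting, `model` would be built from distributions of features of finger motion, contact force, and fingernail acceleration, and `rho` would be computed per participant against their actual similarity judgments.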