Publication | Closed Access
Modeling spatial referencing language for human-robot interaction
Citations: 17
References: 4
Year: 2006
Venue: Unknown
Keywords: Artificial Intelligence, Language Grounding, Engineering, Robotic Agent, Field Robotics, Intelligent Robotics, Cognitive Robotics, Intelligent Systems, Spatial Referencing Language, Computational Linguistics, Human-robot Collaboration, Spatial Language, Robot Learning, Language Studies, Computational Geometry, Robotics Perception, Science Fiction, Vision Robotics, Computer Science, Computer Vision, Contour Points, Automation, Robotics, Linguistics
It has long been a dream of science fiction to have a robot that understands the richness of spoken language. Part of attaining this goal is to exploit the ways that language structures space, which would lead to a more natural way to interact with our robots. By investigating spatial language as a modality for interacting with robots, we can create an interface that is more intuitive for a novice user. In this paper, we outline a method for computing target points to the FRONT, LEFT, and RIGHT of, and BEHIND, segmented objects in evidence grid maps built using range data on a mobile robot. The method uses the segmented objects and their contour points to calculate eigenvectors, which in turn are used to compute the LEFT, RIGHT, FRONT, and BEHIND points. This allows the user to issue spatial referencing commands such as "go behind the desk" or "look to the left of the table." We also present results from a human-subject experiment that validates the algorithm.
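The abstract's core geometric step can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a 2-D contour extracted from the evidence grid map, takes the eigenvectors of the contour's covariance as the object's principal axes, and places the four target points just beyond the object's extent along those axes. Which axis maps to FRONT versus LEFT is an assumption here; the paper's method would resolve that from the robot's or speaker's perspective. The function name `spatial_reference_points` and the `clearance` parameter are hypothetical.

```python
import numpy as np

def spatial_reference_points(contour, clearance=0.5):
    """Sketch: FRONT/BEHIND/LEFT/RIGHT target points for a segmented
    object, from its 2-D contour points (hypothetical interface).

    contour   : (N, 2) array-like of (x, y) contour points
    clearance : stand-off distance from the object, in map units
    """
    pts = np.asarray(contour, dtype=float)
    centroid = pts.mean(axis=0)
    # Eigenvectors of the contour covariance give the object's
    # principal axes (its dominant orientation in the map).
    cov = np.cov((pts - centroid).T)
    _, eigvecs = np.linalg.eigh(cov)          # eigenvalues ascending
    minor, major = eigvecs[:, 0], eigvecs[:, 1]
    # Half-extents of the contour projected onto each axis.
    ext_major = np.abs((pts - centroid) @ major).max()
    ext_minor = np.abs((pts - centroid) @ minor).max()
    # Assumption: FRONT/BEHIND lie along the major axis,
    # LEFT/RIGHT along the minor axis.
    return {
        "FRONT":  centroid + major * (ext_major + clearance),
        "BEHIND": centroid - major * (ext_major + clearance),
        "LEFT":   centroid + minor * (ext_minor + clearance),
        "RIGHT":  centroid - minor * (ext_minor + clearance),
    }
```

For an axis-aligned rectangular contour, the four returned points land on the rectangle's two axes of symmetry, offset from each side by `clearance`; a command like "go behind the desk" would then be grounded as navigation to the BEHIND point.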