Publication | Closed Access
Multimodal integration learning of object manipulation behaviors using deep neural networks
20 Citations · 16 References · Year: 2013 · Venue: unknown
Artificial Intelligence · Multiple Behavior Patterns · Machine Learning · Engineering · Intelligent Robotics · Multimodal Learning · Cognitive Robotics · Object Manipulation · Multimodal Integration Learning · Object Manipulation Behaviors · Social Sciences · Video Interpretation · Multimodal Interaction · Robot Learning · Humanoid Robot · Cognitive Science · Visuomotor Learning · Motion Synthesis · Multimodal Signal Processing · Computer Science · Deep Learning · Computer Vision · Deep Neural Networks · Robotics
This paper presents a novel computational approach for modeling and generating multiple object manipulation behaviors by a humanoid robot. The contribution of this paper is that deep learning methods are applied not only for multimodal sensor fusion but also for sensory-motor coordination. More specifically, a time-delay deep neural network is applied for modeling multiple behavior patterns represented with multi-dimensional visuomotor temporal sequences. By using the efficient training performance of Hessian-free optimization, the proposed mechanism successfully models six different object manipulation behaviors in a single network. The generalization capability of the learning mechanism enables the acquired model to perform the functions of cross-modal memory retrieval and temporal sequence prediction. The experimental results show that the motion patterns for object manipulation behaviors are successfully generated from the corresponding image sequence, and vice versa. Moreover, the temporal sequence prediction enables the robot to interactively switch multiple behaviors in accordance with changes in the displayed objects.
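The time-delay input scheme described in the abstract can be illustrated with a minimal sketch: stack a short window of past visuomotor frames into one input vector and learn to predict the next frame. The toy sine/cosine trajectory, the function name `time_delay_windows`, and the linear least-squares predictor standing in for the paper's deep network are all illustrative assumptions, not the authors' actual model or data.

```python
import numpy as np

def time_delay_windows(seq, window):
    """Stack `window` consecutive frames into one flat input vector,
    the time-delay input used by TDNN-style models (assumed helper)."""
    T, D = seq.shape
    return np.stack([seq[t:t + window].ravel()
                     for t in range(T - window)])

# Hypothetical toy data: a 2-D visuomotor trajectory (e.g. one image
# feature and one joint angle) tracing a periodic motion pattern.
t = np.linspace(0, 4 * np.pi, 200)
seq = np.stack([np.sin(t), np.cos(t)], axis=1)

window = 5
X = time_delay_windows(seq, window)   # inputs: windows of past frames
Y = seq[window:]                      # targets: the frame that follows

# Linear least-squares predictor as a stand-in for the deep network;
# for this linearly recurrent toy signal the fit is essentially exact.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
pred = X @ W
```

Rolling the predicted frame back into the window and repeating gives the kind of temporal sequence prediction the abstract uses for interactive behavior switching, though the paper's network is trained with Hessian-free optimization rather than a closed-form fit.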