Publication | Closed Access
Towards a Robust Interactive and Learning Social Robot
Year: 2018 · Venue: Unknown · Citations: 24 · References: 14
Keywords: Artificial Intelligence, Engineering, Machine Learning, Socially Assistive Robot, Cognitive Robotics, Intelligent Systems, Pepper Perception, Speech Recognition, Robot Learning, Embodied Robotics, Humanoid Robot, Health Sciences, Different Modality, Social Interaction, Computer Science, Deep Learning, Human-robot Interaction, Computer Vision, Automation, Robust Interactive, Speech Processing, Speech Input, Robotics
Pepper is a humanoid robot, specifically designed for social interaction, that has been deployed in a variety of public environments. A programmable version of Pepper is also available, enabling our focused research on the perception and behavior robustness and capabilities of an interactive social robot. We address Pepper's perception by integrating state-of-the-art vision and speech recognition systems and experimentally analyzing their effectiveness. Recognizing the limitations of the individual perceptual modalities, we introduce a multi-modality approach to increase the robustness of human social interaction with the robot. We combine vision, gesture, speech, and input from an onboard tablet, a remote mobile phone, and external microphones. Our approach includes proactively seeking input from a different modality, adding robustness against the failures of the separate components. We also introduce a learning algorithm to improve communication capabilities over time, updating speech recognition through social interactions. Finally, we leverage the robot's rich body-sensory data and introduce both a nearest-neighbor and a deep learning approach that enable Pepper to classify and verbally report a variety of its own body motions. We view the contributions of our work as relevant both to Pepper specifically and to social robots in general.
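The nearest-neighbor idea mentioned in the abstract can be illustrated with a minimal sketch: classify a motion by finding the labeled training motion whose feature vector (e.g. flattened joint angles) is closest in Euclidean distance. The function name, feature layout, and motion labels below are illustrative assumptions, not the paper's actual implementation.

```python
import math

def nearest_neighbor_classify(sample, labeled_motions):
    """Return the label of the training motion closest to `sample`.

    `sample` is a feature vector (here, a hypothetical flattened list of
    joint angles); `labeled_motions` is a list of (label, vector) pairs.
    This is plain 1-NN with Euclidean distance.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best_label, _ = min(
        ((label, dist(sample, vec)) for label, vec in labeled_motions),
        key=lambda pair: pair[1],
    )
    return best_label

# Hypothetical joint-angle feature vectors for two Pepper motions.
training = [
    ("wave", [0.9, 1.2, 0.1, 0.0]),
    ("bow",  [0.0, 0.1, 1.1, 0.8]),
]
print(nearest_neighbor_classify([0.8, 1.1, 0.2, 0.1], training))  # -> wave
```

In practice the paper's deep learning variant would replace the hand-picked distance metric with learned features, but the retrieval-by-similarity structure above is the core of the nearest-neighbor approach.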