Publication | Open Access
Manipulation-based active search for occluded objects
43 Citations · 14 References · Year: 2013 · Venue: Unknown
Keywords: Artificial Intelligence, Engineering, Field Robotics, Intelligent Robotics, Active Visual Search, Cognitive Robotics, Object Manipulation, Range Searching, Computer-aided Design, Intelligent Systems, Image Analysis, Object Search, Robot Learning, Computational Geometry, Robotics Perception, Geometric Modeling, Machine Vision, Computer Science, Computer Vision, PR2 Robot, Manipulation-based Active Search, Developmental Robotics, Natural Sciences, Object Recognition, Automation, Extended Reality, Robotics
Object search is an integral part of daily life, and in the quest for competent mobile manipulation robots it is an unavoidable problem. Previous approaches focus on cases where objects are in unknown rooms but lying out in the open, which transforms object search into active visual search. However, in real life, objects may be in the back of cupboards occluded by other objects, instead of conveniently on a table by themselves. Extending search to occluded objects requires a more precise model and tighter integration with manipulation. We present a novel generative model for representing container contents by using object co-occurrence information and spatial constraints. Given a target object, a planner uses the model to guide an agent to explore containers where the target is likely, potentially needing to move occluding objects to enable further perception. We demonstrate the model on simulated domains and a detailed simulation involving a PR2 robot.
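The abstract's core idea, a model that ranks containers by how likely they are to hold the target given the objects already seen inside them, can be illustrated with a small sketch. Everything here (the co-occurrence table, container contents, and function names) is a hypothetical toy example, not the paper's actual model or data:

```python
# Toy sketch of co-occurrence-guided container search (illustrative
# assumptions only; not the paper's actual generative model).

# Hypothetical co-occurrence weights: how strongly `target` tends to
# appear near `other` (e.g. mugs near kettles in a kitchen cupboard).
CO_OCCURRENCE = {
    ("mug", "kettle"): 0.8,
    ("mug", "plate"): 0.5,
    ("mug", "screwdriver"): 0.1,
}

def cooc(target, other):
    # Small default weight for unseen object pairs.
    return CO_OCCURRENCE.get((target, other), 0.05)

def container_score(target, visible_contents):
    """Unnormalized belief that `target` sits in a container whose
    visible (unoccluded) objects are `visible_contents`."""
    score = 1.0
    for obj in visible_contents:
        score *= 1.0 + cooc(target, obj)
    return score

def rank_containers(target, containers):
    """Order container names from most to least promising to explore."""
    return sorted(
        containers,
        key=lambda name: container_score(target, containers[name]),
        reverse=True,
    )

containers = {
    "kitchen_cupboard": ["kettle", "plate"],  # high co-occurrence with "mug"
    "toolbox": ["screwdriver"],               # low co-occurrence with "mug"
}
print(rank_containers("mug", containers))  # ['kitchen_cupboard', 'toolbox']
```

A planner in the spirit of the paper would explore the top-ranked container first, update the visible contents after moving occluding objects, and re-rank.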