Publication | Closed Access
Fusion of Inertial and Visual Measurements for RGB-D SLAM on Mobile Devices
29 Citations · 18 References
Year: 2015 · Venue: Unknown
Keywords: Location Tracking, Engineering, Location Estimation, Mobile Devices, Field Robotics, Localization, Mapping, Simultaneous Localization, Pure Visual SLAM, Kinematics, Sensor Fusion, RGB-D SLAM, Cartography, Inertial Sensors, Machine Vision, Vision Robotics, Vehicle Localization, Autonomous Navigation, Computer Vision, Visual Measurements, Odometry, Eye Tracking, Robotics, Inertial Measurements
Simultaneous Localization and Mapping (SLAM) algorithms have recently been deployed on mobile devices, where they can enable a broad range of novel applications. Nevertheless, pure visual SLAM is inherently weak in environments with few visual features. Indeed, even many recent proposals based on RGB-D sensors cannot properly handle such scenarios, as several steps of the algorithms rely on matching visual features. In this work we propose a framework, suitable for mobile platforms, that fuses pose estimates obtained from visual and inertial measurements, with the aim of extending the range of scenarios addressable by mobile visual SLAM. The framework employs an array of Kalman filters in which the careful selection of the state variables and the preprocessing of the inertial sensor measurements result in a simple and effective data fusion process. We present qualitative and quantitative experiments showing the improved SLAM performance delivered by the proposed approach.
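The abstract does not detail the filter design, but the general idea of correcting an inertially propagated pose with visual SLAM estimates can be sketched with a minimal per-axis Kalman filter. This is an illustrative sketch only, not the paper's actual formulation: the state vector, noise parameters, and class/method names below are all assumptions.

```python
# Hedged sketch: one Kalman filter per position axis, predicting with a
# preprocessed accelerometer sample and correcting with a visual SLAM
# position estimate. State x = [position, velocity]; q and r are
# illustrative process/measurement noise values, not from the paper.

class AxisKalmanFilter:
    def __init__(self, q=0.01, r=0.05):
        self.x = [0.0, 0.0]                 # [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q = q                          # process noise (inertial drift)
        self.r = r                          # measurement noise (visual pose)

    def predict(self, accel, dt):
        """Propagate the state with an inertial acceleration sample."""
        p, v = self.x
        self.x = [p + v * dt + 0.5 * accel * dt * dt, v + accel * dt]
        # P = F P F^T + Q, with F = [[1, dt], [0, 1]]
        P = self.P
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        self.P = [[p00, p01], [p10, p11]]

    def update(self, pos_meas):
        """Correct with a visual SLAM position estimate (H = [1, 0])."""
        y = pos_meas - self.x[0]          # innovation
        s = self.P[0][0] + self.r         # innovation covariance
        k0 = self.P[0][0] / s             # Kalman gain (position)
        k1 = self.P[1][0] / s             # Kalman gain (velocity)
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        # P = (I - K H) P
        P = self.P
        self.P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                  [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]


# Usage: alternate inertial prediction and visual correction.
kf = AxisKalmanFilter()
kf.predict(accel=0.0, dt=0.1)
kf.update(pos_meas=1.0)   # fused estimate moves toward the visual measurement
```

The fused position lies between the inertial prediction and the visual measurement, weighted by their covariances, which is the basic mechanism any such visual-inertial fusion relies on.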