Publication | Open Access
Sensor Data Fusion for a Mobile Robot Using Neural Networks
Citations: 35 | References: 23 | Year: 2021
Keywords: Robotic Systems, Engineering, Sensor Data Fusion, Neural Network, Field Robotics, Multi-sensor Information Fusion, Intelligent Systems, Multimodal Sensor Fusion, Systems Engineering, Robot Learning, Sensor Fusion, Robotics Perception, Decision Fusion, Machine Vision, Multi-sensor Management, Data Fusion, Robot Perception, Robotic Sensing, Computer Science, Computer Vision, Fusion Algorithm, Robotics, Robot Sensing, Artificial Neural Network
Mobile robots require accurate environmental mapping, and sensor fusion is needed to detect materials that single sensors may miss. The study aims to generate a 2‑D occupancy map that identifies glass obstacles by fusing data from multiple sensors using a neural network. An artificial neural network fuses data from a RealSense stereo camera, 360° LiDAR, and ultrasonic sensors after preprocessing to filter outliers and project 3‑D pointclouds to 2‑D, and the Turtlebot3 Waffle Pi robot implements the system. The fusion algorithm achieved glass and obstacle detection with an RMSE of 3 cm across multiple strategies.
Mobile robots must be able to obtain an accurate map of their surroundings in order to move within them. To detect materials that are undetectable to one sensor but visible to another, it is necessary to construct at least a two-sensor fusion scheme. With this, it is possible to generate a 2D occupancy map in which glass obstacles are identified. An artificial neural network is used to fuse data from a tri-sensor setup (RealSense stereo camera, 2D 360° LiDAR, and ultrasonic sensors) capable of detecting glass and other materials typically found in indoor environments that may or may not be visible to traditional 2D LiDAR sensors, hence the expression "improved LiDAR". A preprocessing scheme is implemented to filter outliers, project the 3D point cloud onto a 2D plane, and adjust distance data. With a neural network as the data fusion algorithm, we integrate all the information into a single, more accurate distance-to-obstacle reading and finally generate a 2D Occupancy Grid Map (OGM) that incorporates information from all sensors. The Robotis Turtlebot3 Waffle Pi robot is used as the experimental platform to conduct experiments under the different fusion strategies. Test results show that with such a fusion algorithm, it is possible to detect glass and other obstacles with an estimated root-mean-square error (RMSE) of 3 cm across multiple fusion strategies.
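The pipeline the abstract describes (outlier/range filtering, projecting the camera's 3-D point cloud onto the 2-D plane of the LiDAR scan, and a neural network that fuses the per-sensor distance readings into one estimate) can be illustrated with a minimal sketch. This is not the paper's implementation: the height band, noise models, network size, and training loop below are all illustrative assumptions, using only NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Preprocessing sketch: project a 3-D point cloud onto the 2-D scan plane ---
def project_to_2d(points, z_min=0.05, z_max=0.40, max_range=3.5):
    """Keep points inside an assumed sensing height band and range
    (outlier filter), then drop z (orthographic projection to 2-D)."""
    z = points[:, 2]
    r = np.linalg.norm(points[:, :2], axis=1)
    mask = (z >= z_min) & (z <= z_max) & (r <= max_range)
    return points[mask, :2]

# --- Fusion sketch: a tiny MLP mapping three sensor readings to one distance ---
def train_fusion_mlp(X, y, hidden=8, lr=0.05, epochs=3000):
    """Full-batch gradient descent on a one-hidden-layer network."""
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(y)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)            # hidden activations
        pred = (h @ W2 + b2).ravel()        # fused distance estimate
        err = pred - y
        g2 = h.T @ err[:, None] / n         # backprop through output layer
        gb2 = err.mean()
        dh = (err[:, None] @ W2.T) * (1 - h ** 2)
        g1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        W1 -= lr * g1; b1 -= lr * gb1; W2 -= lr * g2; b2 -= lr * gb2
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2 + b2).ravel()

# Synthetic stand-in data: true distance plus assumed per-sensor noise/bias
true_d = rng.uniform(0.2, 3.0, 2000)
X = np.stack([true_d + rng.normal(0.00, 0.03, true_d.shape),   # LiDAR
              true_d + rng.normal(0.02, 0.05, true_d.shape),   # stereo camera
              true_d + rng.normal(0.00, 0.04, true_d.shape)],  # ultrasonic
             axis=1)
fuse = train_fusion_mlp(X, true_d)
rmse = np.sqrt(np.mean((fuse(X) - true_d) ** 2))
```

The fused estimate can then be rasterized into an occupancy grid; the reported 3 cm RMSE refers to the authors' hardware experiments, not to this synthetic sketch.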