Concepedia

Publication | Closed Access

COEB-SLAM: A Robust VSLAM in Dynamic Environments Combined Object Detection, Epipolar Geometry Constraint, and Blur Filtering

Citations: 33 · References: 28 · Year: 2023

Abstract

Simultaneous localization and mapping (SLAM) systems estimate a mobile robot's pose and reconstruct a map of the surrounding environment. Most SLAM systems assume that their working environments are static. However, actual environments contain many dynamic objects, which reduce the accuracy and robustness of a SLAM system. In this article, a real-time, robust visual SLAM (VSLAM) system for dynamic environments, named COEB-SLAM (Combined Object detection, Epipolar geometry constraint, and Blur filtering), is proposed; it removes dynamic feature points by combining object detection, an epipolar geometry constraint, and blur filtering. In practice, shooting fast-moving objects produces motion blur, which hampers a dynamic SLAM system's ability to determine an object's state. Therefore, we use the blur level as an additional constraint for identifying dynamic objects. Moreover, removing a large number of feature points can cause tracking failure, so we use a feature-point redistribution method to retain enough high-quality static features. Extensive evaluations on the public TUM RGB-D datasets and in real-world dynamic environments demonstrate that COEB-SLAM is robust in dynamic environments. Compared with the ORB-SLAM2 system, COEB-SLAM reduces the average absolute trajectory error by about 90% on the TUM RGB-D datasets. For the benefit of the community, the source code is publicly available at https://github.com/biscuitzb/COEB-SLAM.
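The two per-feature cues the abstract names can be sketched as follows: a matched feature whose distance to its epipolar line (under an estimated fundamental matrix F) exceeds a threshold is flagged as dynamic, and a variance-of-Laplacian score gives a rough blur level. This is a minimal illustration, not the paper's implementation; all function names and the threshold value are assumptions.

```python
import numpy as np

def epipolar_distance(F, p1, p2):
    """Distance (pixels) of p2 = (x, y) to the epipolar line of p1 under F."""
    a, b, c = F @ np.array([p1[0], p1[1], 1.0])  # line ax + by + c = 0 in frame 2
    return abs(a * p2[0] + b * p2[1] + c) / np.hypot(a, b)

def flag_dynamic(F, pts1, pts2, thresh=1.0):
    """Treat matches whose epipolar distance exceeds thresh as dynamic candidates."""
    return [bool(epipolar_distance(F, p1, p2) > thresh)
            for p1, p2 in zip(pts1, pts2)]

def laplacian_variance(gray):
    """Simple blur score: variance of a discrete Laplacian response (lower = blurrier)."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())
```

For example, for a camera translating purely along the x axis, F is the skew-symmetric matrix of (1, 0, 0) and epipolar lines are horizontal, so a static feature keeps its y coordinate between frames while a feature whose y changes violates the constraint and is flagged.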
