Concepedia

Publication | Closed Access

Robust visual inertial odometry using a direct EKF-based approach

776 Citations · 20 References · Year: 2015

TLDR

The authors propose a monocular visual‑inertial odometry algorithm that directly uses pixel intensity errors of image patches to achieve accurate tracking with high robustness. The method couples multilevel patch tracking to an extended Kalman filter, employing a robocentric, inverse‑distance landmark parametrization that eliminates initialization and improves consistency and computational speed. The approach is validated in highly dynamic hand‑held experiments and successfully integrated into the control loop of a multirotor UAV.

Abstract

In this paper, we present a monocular visual-inertial odometry algorithm which, by directly using pixel intensity errors of image patches, achieves accurate tracking performance while exhibiting a very high level of robustness. After detection, the tracking of the multilevel patch features is closely coupled to the underlying extended Kalman filter (EKF) by directly using the intensity errors as the innovation term during the update step. We follow a purely robocentric approach where the locations of 3D landmarks are always estimated with respect to the current camera pose. Furthermore, we decompose landmark positions into a bearing vector and a distance parametrization, whereby we employ a minimal representation of differences on a corresponding σ-Algebra in order to achieve better consistency and to improve the computational performance. Due to the robocentric, inverse-distance landmark parametrization, the framework does not require any initialization procedure, leading to a truly power-up-and-go state estimation system. The presented approach is successfully evaluated in a set of highly dynamic hand-held experiments as well as directly employed in the control loop of a multirotor unmanned aerial vehicle (UAV).
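The abstract's core idea, using raw pixel intensity errors of a patch as the EKF innovation rather than reprojected feature coordinates, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, state dimensions, and noise model are illustrative assumptions, and only the standard EKF update structure is shown.

```python
import numpy as np

def ekf_photometric_update(x, P, predicted_patch, measured_patch, H, R):
    """One hypothetical EKF update using intensity errors as the innovation.

    x: (n,) state estimate, P: (n, n) state covariance,
    predicted_patch / measured_patch: (m,) pixel intensities,
    H: (m, n) Jacobian of predicted intensities w.r.t. the state,
    R: (m, m) intensity-noise covariance (illustrative assumption).
    """
    # Innovation is the direct photometric error, as described in the abstract.
    y = measured_patch - predicted_patch
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ y                        # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x_new, P_new
```

In the actual method, H would chain the patch's image gradients with the warp induced by the robocentric, inverse-distance landmark parametrization; here it is left generic.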
