Publication | Closed Access
Autonomous Navigation of Vehicles from a Visual Memory Using a Generic Camera Model
Citations: 76 | References: 26 | Year: 2009
Keywords: Engineering, Field Robotics, Autonomous Vehicle Navigation, Autonomous Vehicles, Urban Electric Vehicle, Robot Learning, Visual Memory, Machine Vision, Vision Robotics, Complete Framework, Vehicle Localization, Computer Science, Autonomous Driving, Autonomous Navigation, Computer Vision, Generic Camera Model, Odometry, Eye Tracking, Natural Landmarks, Robotics
In this paper, we present a complete framework for autonomous vehicle navigation using a single camera and natural landmarks. When navigating an unknown environment for the first time, a natural strategy is to memorize key views along the traversed path and use these references as checkpoints for future navigation missions. The navigation framework for wheeled vehicles presented in this paper is based on this idea. During a human-guided learning step, the vehicle traverses paths that are sampled and stored as sets of ordered key images acquired by an embedded camera. The visual paths are topologically organized, providing a visual memory of the environment. Given an image of the visual memory as a target, a navigation mission is defined as a concatenation of visual path subsets called visual routes. During autonomous runs, the controller guides the vehicle along the reference visual route without explicitly planning any trajectory. The controller consists of a vision-based control law adapted to the vehicle's nonholonomic constraint. Our navigation framework has been designed for a generic class of cameras (including conventional, catadioptric, and fisheye cameras). Experiments with an urban electric vehicle navigating in an outdoor environment have been carried out with a fisheye camera along a 750-m-long trajectory. The results validate our approach.
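The abstract describes a visual memory of topologically organized key images and navigation missions built by concatenating visual path subsets into a visual route. The sketch below illustrates that data structure under simple assumptions: key images become nodes in a directed graph, each learned path adds a chain of edges, and a visual route to a target image is found by breadth-first search. All class and method names are illustrative; they are not from the authors' implementation, and real key images would be camera frames rather than string identifiers.

```python
from collections import deque

class VisualMemory:
    """Hypothetical sketch of a topologically organized visual memory:
    nodes are key images, edges link consecutive images of a learned path."""

    def __init__(self):
        self.adj = {}  # key image id -> successor key images

    def add_visual_path(self, key_images):
        """Store one human-guided learning run as an ordered chain of key images."""
        for a, b in zip(key_images, key_images[1:]):
            self.adj.setdefault(a, []).append(b)
            self.adj.setdefault(b, [])

    def visual_route(self, start, target):
        """Concatenate visual path subsets into a route via BFS;
        returns the ordered list of key images, or None if unreachable."""
        queue, parent = deque([start]), {start: None}
        while queue:
            node = queue.popleft()
            if node == target:
                route = []
                while node is not None:
                    route.append(node)
                    node = parent[node]
                return route[::-1]
            for nxt in self.adj.get(node, []):
                if nxt not in parent:
                    parent[nxt] = node
                    queue.append(nxt)
        return None

# Two learned paths that share the key image "I2" can be concatenated
# into a single visual route, as the paper's mission definition suggests.
mem = VisualMemory()
mem.add_visual_path(["I0", "I1", "I2"])
mem.add_visual_path(["I2", "I3", "I4"])
print(mem.visual_route("I0", "I4"))  # ['I0', 'I1', 'I2', 'I3', 'I4']
```

During an autonomous run, the vision-based control law would then servo the vehicle from each key image of the route to the next, with no metric trajectory planning.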