Publication | Closed Access
DriveIRL: Drive in Real Life with Inverse Reinforcement Learning
Year: 2023 · Citations: 31 · References: 19 · Venue: Unknown
Keywords: Artificial Intelligence, Path Planning, Inverse Reinforcement Learning, Las Vegas, Trajectory Planning, Machine Learning, Data Science, Las Vegas Strip, Autonomous Learning, Engineering, Automation, Systems Engineering, Action Model Learning, Computer Science, Intelligent Systems, Robot Learning, Autonomous Driving, Planning
In this paper, we introduce the first published planner to drive a car in dense, urban traffic using Inverse Reinforcement Learning (IRL). Our planner, DriveIRL, generates a diverse set of trajectory proposals and scores them with a learned model. The best trajectory is tracked by our self-driving vehicle's low-level controller. We train our trajectory scoring model on a 500+ hour real-world dataset of expert driving demonstrations in Las Vegas within the maximum entropy IRL framework. DriveIRL's benefits include: a simple design due to only learning the trajectory scoring function, a flexible and relatively interpretable feature engineering approach, and strong real-world performance. We validated DriveIRL on the Las Vegas Strip and demonstrated fully autonomous driving in heavy traffic, including scenarios involving cut-ins, abrupt braking by the lead vehicle, and hotel pickup/dropoff zones. Our dataset, a part of nuPlan, has been released to the public to help further research in this area.
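The propose-and-score design described in the abstract can be illustrated with a minimal sketch. All names, features, and weights below are illustrative assumptions, not the paper's actual code: each candidate trajectory is reduced to a hand-engineered feature vector, a learned linear weight vector scores it, the maximum entropy IRL framework treats trajectories as samples from a softmax distribution over these scores, and the top-scoring proposal is handed to the low-level controller.

```python
import math

def score(features, weights):
    # Linear trajectory score: higher means more expert-like.
    return sum(f * w for f, w in zip(features, weights))

def trajectory_probs(all_features, weights):
    # Max-ent softmax distribution over candidate trajectories,
    # with a max-shift for numerical stability.
    scores = [score(f, weights) for f in all_features]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def select_best(all_features, weights):
    # Pick the index of the highest-scoring proposal to track.
    scores = [score(f, weights) for f in all_features]
    return max(range(len(scores)), key=lambda i: scores[i])

# Toy candidates with hypothetical features:
# (progress, -jerk, -proximity_to_lead_vehicle)
candidates = [
    (1.0, -0.2, -0.1),   # smooth, makes good progress
    (1.2, -0.9, -0.8),   # faster but jerky and close to the lead vehicle
    (0.3, -0.1, -0.05),  # overly cautious
]
weights = (1.0, 1.0, 2.0)  # hypothetical learned weights

best = select_best(candidates, weights)
probs = trajectory_probs(candidates, weights)
```

During training, the weights would be adjusted so the expert's demonstrated trajectory receives high probability under `trajectory_probs`; at inference, only `select_best` is needed.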