Publication | Closed Access
Generating Adversarial Driving Scenarios in High-Fidelity Simulators
Citations: 123
References: 29
Year: 2019
Venue: Unknown
Topics: Artificial Intelligence, Self-driving Software, Machine Learning, Engineering, Autonomous Vehicle Navigation, Simulation, Advanced Driver-assistance System, Intelligent Systems, Bayesian Optimization, Driver Behavior, Adversarial Machine Learning, Systems Engineering, Self-driving Vehicle, Modeling And Simulation, Robot Learning, Road Traffic Safety, Computer Science, Autonomous Driving, Driver Performance, High-fidelity Simulators, Generative Adversarial Network
In recent years, self-driving vehicles have become more common on public roads, promising safer and more efficient transportation systems. Increasing their reliability on the road requires an extensive suite of software tests, ideally performed in high-fidelity simulators where multiple vehicles and pedestrians interact with the self-driving vehicle. It is therefore of critical importance that self-driving software be assessed against a wide range of challenging simulated driving scenarios. The state of the art in driving-scenario generation, as adopted by some front-runners of the self-driving car industry, still relies on human input [1]. In this paper we propose to automate the process, using Bayesian optimization to generate adversarial driving scenarios that expose poorly engineered or poorly trained self-driving policies and increase the risk of collision with simulated pedestrians and vehicles. We show that by incorporating the generated scenarios into the training set of the self-driving policy, and by fine-tuning the policy with vision-based imitation learning, we obtain safer self-driving behavior.
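The core idea — Bayesian optimization searching over scenario parameters to maximize collision risk — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `simulated_risk` objective, the two scenario parameters (pedestrian crossing offset and speed), and the Gaussian-process hyperparameters are all hypothetical stand-ins; in the paper the objective would come from rolling out the self-driving policy in a high-fidelity simulator.

```python
import numpy as np
from math import erf, sqrt, pi, exp

def simulated_risk(x):
    # Hypothetical stand-in for a simulator rollout: maps scenario
    # parameters (pedestrian crossing offset, pedestrian speed) to a
    # scalar risk score in (0, 1], peaked at an "adversarial" setting.
    offset, speed = x
    return np.exp(-((offset - 0.3) ** 2 + (speed - 0.7) ** 2) / 0.05)

def rbf_kernel(A, B, length=0.2):
    # Squared-exponential kernel between two sets of points.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length ** 2))

def gp_posterior(X, y, Xq, noise=1e-5):
    # Gaussian-process posterior mean and std at query points Xq,
    # given observations (X, y). Jitter keeps the solve stable.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xq)
    mu = Ks.T @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks)
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-z * z / 2.0) / sqrt(2.0 * pi)

def bayes_opt_scenarios(objective, bounds, n_init=5, n_iter=15, seed=0):
    # Bayesian optimization loop: fit a GP to observed (scenario, risk)
    # pairs, then pick the next scenario by maximizing expected
    # improvement over a random candidate set.
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    X = rng.uniform(lo, hi, size=(n_init, len(bounds)))
    y = np.array([objective(x) for x in X])
    for _ in range(n_iter):
        cand = rng.uniform(lo, hi, size=(256, len(bounds)))
        mu, sigma = gp_posterior(X, y, cand)
        best = y.max()
        z = (mu - best) / sigma
        ei = np.array([(m - best) * norm_cdf(zz) + s * norm_pdf(zz)
                       for m, s, zz in zip(mu, sigma, z)])
        x_next = cand[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    i = int(np.argmax(y))
    return X[i], y[i]

# Search the (hypothetical) scenario space for the riskiest scenario.
best_x, best_risk = bayes_opt_scenarios(simulated_risk,
                                        [(0.0, 1.0), (0.0, 1.0)])
```

In the paper's setting, scenarios found this way would then be added to the policy's training set for fine-tuning via imitation learning, closing the loop between attack generation and policy improvement.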