Publication | Closed Access
BarrierNet: Differentiable Control Barrier Functions for Learning of Safe Robot Control
Citations: 93
References: 46
Year: 2023
Topics: Artificial Intelligence · Engineering · Machine Learning · Field Robotics · Intelligent Robotics · Learning Control · Robotic Control · dCBF Solution · Robot Traffic Merging · Trajectory Planning · Systems Engineering · Robot Learning · Safe Robot Control · Intelligent Control · Computer Engineering · Computer Science · Autonomous Driving · Safety Control · Deep Learning · Robot Control · Aerospace Engineering · Robotics
Many safety-critical applications of neural networks, such as robotic control, require safety guarantees. This article introduces a method for ensuring the safety of learned control models using differentiable control barrier functions (dCBFs). dCBFs are end-to-end trainable and guarantee safety, and they improve over classical control barrier functions (CBFs), which are usually overly conservative. Our dCBF solution relaxes the CBF definitions by: 1) making them depend on environmental inputs and 2) embedding them into differentiable quadratic programs. These novel safety layers, called BarrierNet, can be used in conjunction with any neural-network-based controller and are trained by gradient descent. With BarrierNet, the safety constraints of a neural controller become adaptable to changing environments. We evaluate BarrierNet on several problems: 1) robot traffic merging; 2) robot navigation in 2-D and 3-D spaces; and 3) end-to-end vision-based autonomous driving in a sim-to-real environment and in physical experiments, and we demonstrate its effectiveness compared to state-of-the-art approaches.
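To make the idea concrete, here is a minimal sketch of the kind of CBF-based quadratic-program safety layer the abstract describes, not the paper's actual BarrierNet implementation. With a single affine safety constraint, the QP (minimize the deviation from a nominal control subject to the CBF condition) has a closed-form solution: projection onto a halfspace. The `alpha` gain stands in, hypothetically, for the environment-dependent class-K parameter that a network would output and that gradient descent would tune; all names and the 1-D example system are assumptions for illustration.

```python
import numpy as np

def cbf_qp_filter(u_ref, a, b):
    """Closed-form solution of the single-constraint CBF-QP safety layer:
        min_u ||u - u_ref||^2   s.t.   a @ u + b >= 0
    i.e. Euclidean projection of the nominal control u_ref onto the
    safe halfspace. Piecewise differentiable in (u_ref, a, b), so it can
    sit at the end of a network trained by gradient descent.
    """
    slack = a @ u_ref + b
    if slack >= 0:
        return u_ref                      # nominal control already safe
    return u_ref - (slack / (a @ a)) * a  # minimal correction onto boundary

# Hypothetical 1-D example: single integrator x' = u, safe set h(x) = x >= 0.
# The CBF condition h' + alpha*h >= 0 becomes u + alpha*x >= 0.
alpha = 2.0                         # class-K gain (network output in BarrierNet)
x = 0.5                             # current state, safely inside the set
u_ref = np.array([-3.0])            # nominal controller pushes toward the boundary
a, b = np.array([1.0]), alpha * x   # constraint: u + alpha*x >= 0
u_safe = cbf_qp_filter(u_ref, a, b)
print(u_safe)                       # clipped to -alpha*x = [-1.0]
```

Because the filter depends smoothly on `alpha` (away from the switching point), a learned, state-dependent `alpha` lets the safety constraint tighten or relax with the environment rather than being fixed conservatively, which is the adaptability the abstract claims for BarrierNet.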