Toward Scalable Verification for Safety-Critical Deep Networks
Penn collection
Machine Programming
Author
Kuper, Lindsey
Gottschlich, Justin E
Julian, Kyle
Barrett, Clark
Kochenderfer, Mykel J
Abstract
The increasing use of deep neural networks for safety-critical applications, such as autonomous driving and flight control, raises concerns about their safety and reliability. Formal verification can address these concerns by guaranteeing that a deep learning system operates as intended, but the state of the art is limited to small systems. In this work-in-progress report, we give an overview of our efforts to mitigate this difficulty by pursuing two complementary directions: devising scalable verification techniques, and identifying design choices that make deep learning systems more amenable to verification.
Date of presentation
2018-01-01
Conference name
Machine Programming