Toward Scalable Verification for Safety-Critical Deep Networks

Penn collection
Machine Programming
Author
Kuper, Lindsey
Katz, Guy
Julian, Kyle
Barrett, Clark
Kochenderfer, Mykel J
Abstract

The increasing use of deep neural networks for safety-critical applications, such as autonomous driving and flight control, raises concerns about their safety and reliability. Formal verification can address these concerns by guaranteeing that a deep learning system operates as intended, but the state of the art is limited to small systems. In this work-in-progress report, we give an overview of our efforts to mitigate this difficulty by pursuing two complementary directions: devising scalable verification techniques, and identifying design choices that make deep learning systems more amenable to verification.

Date of presentation
2018-01-01
Conference name
Machine Programming