SCREENING OUT-OF-DISTRIBUTION INPUTS FOR RELIABLE AI PREDICTIONS

Degree type
Doctor of Philosophy (PhD)
Graduate group
Computer and Information Science
Discipline
Computer Sciences
Subject
Covariate Shifts
Deep Learning
Out-of-Distribution
Reliable Predictions
Training Distribution
Trustworthy Deployment
Copyright date
2023
Author
Kaur, Ramneet
Abstract

With the remarkable performance of machine learning models such as deep neural networks (DNNs) across domains, there is significant interest in deploying these models in real-world AI systems. DNNs are, however, known to produce incorrect predictions on inputs from scenarios that are unlikely under their training distribution. This limitation is one of the key challenges to the adoption of DNNs in real-world high-assurance systems such as autonomous driving and medical diagnosis, where unexpected scenarios are unavoidable. In this thesis, we propose self-supervised detectors for screening inputs to DNNs that arise from novel scenarios due to (1) unknown classes and (2) covariate shifts. These detectors come with theoretical guarantees of bounded false alarm rates. Existing detectors for unknown classes are tied to spurious training features. We propose that screening for unknown classes be performed conditioned on the classifier's robustness to spurious training features, since detection with respect to spurious features limits the utility of a classifier that generalizes beyond its training distribution. This thesis also presents an empirical approach for predicting the generalization capability of a DNN classifier in novel scenarios arising from covariate shifts. We demonstrate that the proposed approaches are applicable (1) across data modalities (physiological, vision, audio, and point-cloud data) and (2) in different practical applications, including distribution shifts for perception modules in autonomous cars and classification modules for gait analysis in medical systems.
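The abstract does not spell out how the bounded false alarm rate is obtained; one common way to achieve such a guarantee is conformal-style calibration of a detection threshold on held-out in-distribution data. The sketch below is only an illustration under that assumption and is not the detector construction from the thesis; `score_fn`, the calibration set, and `epsilon` are hypothetical placeholders.

```python
# Minimal sketch (illustrative only): conformal-style OOD screening with a
# bounded false alarm rate. Assumes a self-supervised nonconformity score
# `score_fn` (hypothetical), e.g. the reconstruction error of a model trained
# only on in-distribution data.
import numpy as np

def calibrate_threshold(calib_scores, epsilon):
    """Pick a threshold from held-out in-distribution scores so that the
    probability of flagging an in-distribution input (false alarm) is <= epsilon."""
    n = len(calib_scores)
    # Conformal quantile: the ceil((n + 1) * (1 - epsilon))-th smallest calibration score.
    k = int(np.ceil((n + 1) * (1 - epsilon)))
    k = min(k, n)  # guard against epsilon being too small for the calibration set size
    return np.sort(calib_scores)[k - 1]

def screen(x, score_fn, threshold):
    """Flag x as out-of-distribution if its nonconformity score exceeds the threshold."""
    return score_fn(x) > threshold

# Usage (all names hypothetical):
# calib_scores = np.array([score_fn(x) for x in in_distribution_calibration_set])
# tau = calibrate_threshold(calib_scores, epsilon=0.05)  # at most 5% false alarms
# is_ood = screen(test_input, score_fn, tau)
```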

Advisor
Lee, Insup, Prof.
Sokolsky, Oleg, Prof.
Date of degree
2023