Approaches Addressing Algorithmic Bias and Disclosiveness

Degree type
Doctor of Philosophy (PhD)
Graduate group
Statistics
Discipline
Statistics and Probability
Computer Sciences
Copyright date
2023
Author
Diana, Emily Ruth
Abstract

While data science enables rapid societal advancement, deferring decisions to machines does not automatically avoid egregious equity or privacy violations. Without safeguards throughout the scientific process, from data collection to algorithm design to model deployment, machine learning models can easily inherit or amplify existing biases and vulnerabilities present in society. This dissertation focuses on techniques to encode algorithms with ethical norms and to construct frameworks ensuring that statistics and machine learning methods are deployed in a socially responsible manner. In particular, it presents theoretically rigorous and empirically verified techniques to mitigate automated bias and protect individual privacy. We begin with a discussion of two definitional contributions to the algorithmic fairness literature: minimax group fairness and lexicographic fairness. In contrast to the popular fairness goal of equalizing certain statistics across groups (such as error rate, false positive rate, or selection rate), minimax group fairness aims to make the worst-off group as well off as possible. As a natural extension, lexicographic fairness applies this goal recursively. Next, we explore a financial scenario in which clients involved in securities lending are incentivized to lie about their demands to protect their privacy. Leveraging techniques from the differential privacy literature, we present a resource allocation algorithm that is simultaneously private, approximately optimal, and incentive compatible. Finally, we consider the setting in which the features with respect to which we would like to attain fairness or balance are unavailable for the majority of our data, and we introduce a pair of algorithms for the strategic development of proxy attributes under equity constraints.
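
As an illustrative sketch (not drawn verbatim from the dissertation), the minimax group fairness goal described above can be written as follows, assuming a hypothesis class H, a collection of groups G, and a per-group error measure err(h, g):

\[
h^{*} \in \arg\min_{h \in H} \; \max_{g \in G} \; \mathrm{err}(h, g),
\]

in contrast to equality-based notions that constrain err(h, g) to be (approximately) equal across all groups g. Lexicographic fairness applies this objective recursively: after minimizing the worst group's error, it minimizes the next-worst group's error subject to that optimum, and so on.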

Advisor
Kearns, Michael
Roth, Aaron
Date of degree
2023