Race, Algorithms, and Causality
Discipline
Computer Sciences
Statistics and Probability
Abstract
In this dissertation, I investigate how race, algorithms, and causality intersect. The first chapter explores social constructivist views of race and evaluates Sally Haslanger's account. To strengthen Quayshawn Spencer's theory of genuine kinds, I draw on Philip Dawid's notion of extended conditional independence and James Woodward's notion of stability. I then use this theory of kinds to analyze Haslanger's social kinds and argue that they are weakly genuine kinds, or at best methodologically operationalized. In chapter two, I argue for a position I call causal agnosticism about race: for any socially constructed race, it is reasonable to remain uncertain about its causal influence on any outcome. This position concerns the epistemic reasonableness of causal claims about race and is independent of any metaphysical claims about the nature of race itself. My argument for causal agnosticism emphasizes the importance of attending to confounding and to the stability of causal relationships when selecting variables, a choice that is central to decision-making in causal inference research. Chapter three addresses the problem of learning from data shaped by underrepresentation and intersectional bias in machine learning. With my co-author, Emily Ruth Diana, I present a new model and algorithm to address these biases. Our model accounts for how membership in multiple groups can affect biases in a dataset. Our algorithm uses a reweighting technique to estimate the drop-out rate within each group and approximate the loss of any hypothesis on the true distribution. In the final chapter, written with co-authors Jamelle Watson-Daniels and Camille Harris, I examine racial bias in algorithms. We provide a framework for understanding and combating this bias, shedding light on the subtle and often disguised ways racism can manifest in algorithmic contexts, and we aim to encourage researchers to recognize and address the history of racial neglect and disregard in technical research and development.
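The reweighting idea summarized for chapter three can be illustrated with a minimal sketch. The code below is not the dissertation's algorithm: it assumes a hypothetical setting where each group's retention probability (one minus its drop-out rate) is already known or estimated from an auxiliary source, and it shows the basic inverse-probability-weighting step by which observed per-example losses can be reweighted to approximate a hypothesis's loss on the true distribution.

import numpy as np

# Hypothetical per-group retention probabilities (1 - drop-out rate),
# assumed to be known or estimated from an auxiliary source.
retention = {"A": 0.9, "B": 0.4}

def reweighted_loss(example_losses, example_groups, retention):
    """Inverse-probability-weighted estimate of a hypothesis's loss on the
    true distribution: each observed example is up-weighted by the inverse
    of its group's retention probability, then the weights are normalized."""
    weights = np.array([1.0 / retention[g] for g in example_groups])
    return float(np.average(example_losses, weights=weights))

# Example: per-example 0/1 losses of some hypothesis on the observed data.
losses = np.array([0.0, 1.0, 1.0, 0.0, 1.0])
groups = np.array(["A", "A", "B", "B", "A"])
print(reweighted_loss(losses, groups, retention))

The actual model in chapter three additionally handles intersectional structure, where drop-out can depend on combinations of group memberships, and estimates the drop-out rates rather than assuming them as this sketch does.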
Advisor
Spencer, Quayshawn