Date of Award


Degree Type


Degree Name

Doctor of Philosophy (PhD)

Graduate Group

Computer and Information Science

First Advisor

Andreas Haeberlen


Recent growth in the size and scope of databases has spurred research into making productive use of this data. Unfortunately, a significant stumbling block remains: protecting the privacy of the individuals who populate these datasets. As people spend more time connected to the Internet and conduct more of their daily lives online, privacy becomes an increasingly important consideration, just as the data becomes more useful to researchers, companies, and individuals. As a result, a great deal of important data remains locked away and unavailable to legitimate researchers today, due to fears that data leaks will harm individuals.

Recent research in differential privacy opens a promising pathway to guaranteeing individual privacy while still making use of the data to answer useful queries. Differential privacy is a theory that provides provable, information-theoretic guarantees on what any answer may reveal about any single individual in the database. This approach has produced a flurry of recent research presenting novel algorithms that can perform a rich class of computations in this model.
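To make the guarantee concrete, the following is a minimal sketch (not taken from the dissertation) of the classic Laplace mechanism applied to a counting query; the function name `dp_count` and its parameters are illustrative assumptions:

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Return a differentially private count of records matching predicate.

    A counting query has sensitivity 1: adding or removing one individual
    changes the true count by at most 1, so adding Laplace noise with
    scale 1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) noise via the inverse-CDF transform.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

Smaller values of `epsilon` give stronger privacy at the cost of noisier answers; the dissertation's runtimes enforce this same mathematical guarantee under adversarial conditions.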


In this dissertation, we focus on practical challenges that arise when trying to provide differential privacy guarantees in the real world. We design and build runtimes that achieve the mathematical differential privacy guarantee in the face of three real-world challenges: securing the runtimes against adversaries, enabling readers to verify that the answers are accurate, and dealing with data distributed across multiple domains.