Automating Program Analysis For Differential Privacy
This dissertation explores techniques for automating program analysis, with a focus on validating and securely executing differentially private programs. Differential privacy allows analysts to study general patterns across a population of individuals, while providing strong protections against leaking any single individual’s data.
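To make the guarantee concrete, a standard example (not drawn from the dissertation itself; all names below are illustrative) is the Laplace mechanism applied to a counting query. Because adding or removing one person changes a count by at most 1, Laplace noise scaled to 1/ε suffices for ε-differential privacy:

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: one individual's record can
    # change the true count by at most 1, so adding Laplace noise with
    # scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

The analyst sees only the noisy count, whose distribution changes by at most a factor of e^ε between any two neighboring databases.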
To check automatically that programs satisfy differential privacy, we develop Fuzzi: a three-level logic for differential privacy. Fuzzi’s lowest level is a general-purpose logic; its middle level is apRHL, a program logic for mechanically constructing differential privacy proofs; and its top level is a novel sensitivity logic for tracking sensitivity bounds, a fundamental building block of differential privacy.
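The flavor of compositional sensitivity tracking can be sketched with two textbook rules (these are standard rules from the sensitivity-analysis literature, not Fuzzi’s actual typing rules, and the function names are hypothetical):

```python
# Each variable carries an upper bound on how much its value can change
# when one individual's data changes; bounds compose rule by rule.

def sens_add(s_x, s_y):
    # Sensitivity of x + y is at most s_x + s_y.
    return s_x + s_y

def sens_scale(c, s_x):
    # Sensitivity of c * x scales by |c|.
    return abs(c) * s_x

# Example: if x and y each have sensitivity 1, then 3*x + y has
# sensitivity at most 4, so Laplace noise with scale 4/epsilon
# suffices for epsilon-differential privacy.
bound = sens_add(sens_scale(3, 1), 1)
```

A typechecker built from such rules derives a sensitivity bound for a whole program by walking its syntax, which is exactly the kind of compositional reasoning a sensitivity logic automates.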
Some differentially private algorithms have sophisticated proofs that cannot be derived by a compositional typechecking process. To detect incorrect implementations of these algorithms, we develop DPCheck, a tool for testing differential privacy automatically. Adapting a well-known “pointwise” proof technique for differential privacy, DPCheck observes runtime program behaviors and derives formulas that constrain potential privacy proofs.
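For intuition, a much cruder alternative to DPCheck’s symbolic approach is a purely statistical test (in the spirit of black-box DP testers, not DPCheck’s actual method; the function names are hypothetical): run a mechanism many times on two neighboring databases and compare how often a chosen output event occurs.

```python
import math

def estimate_privacy_loss(mechanism, db1, db2, event, trials=20000):
    # Empirically estimate the probability of an output event on each
    # of two neighboring databases. If the observed log-ratio exceeds
    # the claimed epsilon, the implementation is suspect. This naive
    # sampling test needs many trials and a well-chosen event, which
    # is what motivates symbolic approaches like DPCheck's.
    p1 = sum(1 for _ in range(trials) if event(mechanism(db1))) / trials
    p2 = sum(1 for _ in range(trials) if event(mechanism(db2))) / trials
    if p1 == 0 or p2 == 0:
        return float("inf") if p1 != p2 else 0.0
    return abs(math.log(p1 / p2))
```

A correct ε-differentially private mechanism should keep this estimate below ε (up to sampling error) for every event; a buggy one will exceed it for some event.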
Once we are convinced that a program is differentially private, we often still have to trust that the machine executing it does not misbehave and leak sensitive results. For analytics at scale, computation is often delegated to networked computers that may become compromised. To run differentially private analytics securely at scale, we develop Orchard, a system that can answer many differentially private queries over data distributed among millions of user devices. Orchard leverages cryptographic primitives to employ untrusted computers while preventing them from observing sensitive results.
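Orchard itself builds on homomorphic encryption and other cryptographic machinery; the high-level idea that an untrusted aggregator can compute a sum without seeing any individual contribution can be illustrated with a simple additive-masking sketch (purely illustrative; the names and scheme below are not Orchard’s actual protocol):

```python
import random

MOD = 2 ** 32  # values are aggregated in a fixed modular ring

def mask_values(values):
    # Every pair of devices (i, j) agrees on a random pairwise mask;
    # device i adds it and device j subtracts it. Each masked value
    # looks random on its own, but the masks cancel in the total.
    n = len(values)
    offsets = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            m = random.randrange(MOD)
            offsets[i] += m
            offsets[j] -= m
    return [(v + o) % MOD for v, o in zip(values, offsets)]

def aggregate(masked):
    # The untrusted aggregator sums the masked values; the pairwise
    # masks cancel modulo MOD, revealing only the total.
    return sum(masked) % MOD
```

Combined with differentially private noise added before release, this style of protocol lets untrusted machines do the heavy lifting of aggregation while each device’s raw value stays hidden.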