Beyond Classical Statistics: Optimality in Transfer Learning and Distributed Learning

Hongji Wei, University of Pennsylvania

Abstract

In modern statistical learning practice, statisticians deal with increasingly large, complicated, and structured data sets. Better-structured data and powerful data analytic resources create new opportunities in the learning process, but large data sets also raise new challenges due to limits on computation and communication resources as well as privacy concerns. Within the decision-theoretic framework, statistical optimality must therefore be reconsidered in the presence of new types of data or new constraints. Under the framework of minimax theory, this thesis addresses the following four problems:

1. The first part of this thesis develops an optimality theory for transfer learning in nonparametric classification. A nearly optimal adaptive classifier is also constructed.

2. In the second part, we study distributed Gaussian mean estimation with known variance under communication constraints. The exact distributed minimax rate of convergence is derived under three different communication protocols.

3. In the third part, we study distributed Gaussian mean estimation with unknown variance under communication constraints. The results show that the amount of additional communication cost depends on the type of the underlying communication protocol.

4. In the fourth part, we investigate the minimax optimality and the communication cost of adaptation for distributed nonparametric function estimation under communication constraints.
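For reference, the distributed minimax risk invoked in parts 2 through 4 can be written in the following standard form (the notation below is a generic sketch for orientation, not the thesis's own):

\[
R^*(B) \;=\; \inf_{\hat{\theta} \in \mathcal{A}(B)} \; \sup_{\theta \in \Theta} \; \mathbb{E}_{\theta}\,\bigl\|\hat{\theta} - \theta\bigr\|^2,
\]

where \(\Theta\) is the parameter space and \(\mathcal{A}(B)\) denotes the class of estimation procedures realizable by communication protocols whose total transcript length does not exceed a budget of \(B\) bits. A rate is called exact when matching upper and lower bounds on \(R^*(B)\) are established.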

Subject Area

Statistics|Mathematics|Computer science

Recommended Citation

Wei, Hongji, "Beyond Classical Statistics: Optimality in Transfer Learning and Distributed Learning" (2022). Dissertations available from ProQuest. AAI29162821.
https://repository.upenn.edu/dissertations/AAI29162821
