Date of Award
2022
Degree Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Graduate Group
Statistics
First Advisor
Tony Cai
Abstract
In modern statistical learning practice, statisticians deal with increasingly large, complicated, and structured data sets. Better-structured data and powerful data-analytic resources create new opportunities in the learning process, but large data sets also bring new challenges arising from limited computation, limited communication resources, or privacy concerns. Within the decision-theoretic framework, statistical optimality must be reconsidered for these new types of data and new constraints. Under the framework of minimax theory, this thesis addresses the following four problems:

1. The first part of this thesis develops an optimality theory for transfer learning in nonparametric classification. A near-optimal adaptive classifier is also constructed.

2. In the second part, we study distributed Gaussian mean estimation with known variance under communication constraints. The exact distributed minimax rate of convergence is derived under three different communication protocols.

3. In the third part, we study distributed Gaussian mean estimation with unknown variance under communication constraints. The results show that the additional communication cost of handling the unknown variance depends on the type of underlying communication protocol.

4. In the fourth part, we investigate the minimax optimality and the communication cost of adaptation for distributed nonparametric function estimation under communication constraints.
Recommended Citation
Wei, Hongji, "Beyond Classical Statistics: Optimality In Transfer Learning And Distributed Learning" (2022). Publicly Accessible Penn Dissertations. 5519.
https://repository.upenn.edu/edissertations/5519