Optimal Rates of Convergence for Noisy Sparse Phase Retrieval via Thresholded Wirtinger Flow

Penn collection
Statistics Papers
Discipline
Physical Sciences and Mathematics
Subject
Iterative adaptive thresholding
minimax rate
non-convex empirical risk
phase retrieval
sparse recovery
thresholded gradient method
Author
Cai, Tony
Li, Xiaodong
Ma, Zongming
Abstract

This paper considers the noisy sparse phase retrieval problem: recovering a sparse signal x ∈ ℝ^p from noisy quadratic measurements y_j = (a_j′ x)^2 + ε_j, j = 1, …, m, with independent sub-exponential noise ε_j. The goals are to understand the effect of the sparsity of x on the estimation precision and to construct a computationally feasible estimator that achieves the optimal rates adaptively. Inspired by the Wirtinger Flow [IEEE Trans. Inform. Theory 61 (2015) 1985–2007] proposed for non-sparse and noiseless phase retrieval, a novel thresholded gradient descent algorithm is proposed and shown to adaptively achieve the minimax optimal rates of convergence over a wide range of sparsity levels when the a_j's are independent standard Gaussian random vectors, provided that the sample size is sufficiently large compared to the sparsity of x.
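
To make the iteration concrete, the following Python sketch implements a simplified thresholded gradient (Wirtinger-flow-style) iteration for real signals. It only illustrates the general idea described in the abstract, not the paper's exact procedure: the function name `thresholded_wf`, the crude support-restricted spectral initialization, and the step-size and threshold choices (`step`, `thresh_const`, the √(log p / m) rule) are ad hoc assumptions rather than the calibrated quantities analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Coordinate-wise soft thresholding operator."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def thresholded_wf(A, y, k, n_iter=200, step=0.1, thresh_const=1.0):
    """Illustrative thresholded gradient descent for noisy sparse phase retrieval.

    A : (m, p) array whose rows are the measurement vectors a_j
    y : (m,) noisy quadratic measurements y_j = (a_j' x)^2 + eps_j
    k : assumed sparsity level (used only by the crude initialization below)
    Returns an estimate of x, identifiable only up to a global sign.
    """
    m, p = A.shape

    # Crude sparse spectral initialization: rank coordinates by the marginal
    # statistic (1/m) * sum_j y_j a_{j,i}^2, keep the top k, and take the
    # leading eigenvector of the weighted matrix restricted to that support.
    marginals = (A ** 2).T @ y / m
    support = np.argsort(marginals)[-k:]
    M = (A[:, support].T * y) @ A[:, support] / m
    eigvals, eigvecs = np.linalg.eigh(M)
    z = np.zeros(p)
    z[support] = eigvecs[:, -1]
    z *= np.sqrt(max(np.mean(y), 0.0))  # E[(a_j' x)^2] = ||x||^2 for Gaussian a_j

    for _ in range(n_iter):
        Az = A @ z
        residual = Az ** 2 - y              # misfit of the quadratic model
        grad = (A.T @ (residual * Az)) / m  # gradient (up to a constant) of the empirical risk
        z = z - step * grad
        # Heuristic threshold proportional to the noise scale of the gradient coordinates.
        tau = thresh_const * np.sqrt(np.log(p) / m) * np.linalg.norm(z)
        z = soft_threshold(z, tau)
    return z
```

The thresholding step after each gradient update is what promotes sparsity of the iterates; the actual threshold sequence, initialization, and sample-size conditions needed for the minimax-optimal guarantees are those specified in the paper.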

Publication date
2016-01-01
Journal title
The Annals of Statistics