Journal of Statistical Planning and Inference
Detecting significance in high-dimensional sparse data structures has received considerable attention in modern statistics. In this paper, we introduce a compound decision rule that simultaneously classifies observations as signal or noise. The procedure is a Bayes rule under a mixture loss function that penalizes false discoveries while controlling false nondiscoveries by incorporating signal-strength information: under this criterion, a strong signal is penalized more heavily for nondiscovery than a weak one. In constructing the classification rule, we place a mixture prior on the parameter that adapts to the unknown sparsity. The resulting Bayes rule can be viewed as thresholding the “local fdr” (Efron, 2007) at adaptive thresholds. Both parametric and nonparametric estimation methods are discussed; the nonparametric procedure adapts well to the unknown data structure and outperforms the parametric one. The performance of the procedure is illustrated by simulation studies and a real data application.
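As a rough illustration of the local-fdr thresholding idea mentioned in the abstract (not the paper's adaptive, signal-strength-dependent rule), here is a minimal sketch assuming a known two-groups normal mixture. The mixture weight `p0`, the alternative mean `mu1`, and the fixed threshold `0.2` are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

def local_fdr(z, p0=0.9, mu1=3.0):
    """Local fdr under an assumed two-groups model:
    z ~ p0 * N(0, 1) + (1 - p0) * N(mu1, 1).
    Returns P(null | z) = p0 * f0(z) / f(z)."""
    f0 = norm.pdf(z)               # null density
    f1 = norm.pdf(z, loc=mu1)      # alternative density (assumed known here)
    f = p0 * f0 + (1 - p0) * f1    # marginal mixture density
    return p0 * f0 / f

def classify(z, threshold=0.2, **mixture_params):
    """Flag an observation as a signal when its local fdr
    falls below the (here fixed, illustrative) threshold."""
    return local_fdr(z, **mixture_params) < threshold

# Simulated example: 1000 observations, 10% signals at mean 3
rng = np.random.default_rng(0)
n, p0, mu1 = 1000, 0.9, 3.0
is_signal = rng.random(n) > p0
z = rng.normal(np.where(is_signal, mu1, 0.0), 1.0)
flags = classify(z, threshold=0.2, p0=p0, mu1=mu1)
```

In the paper the threshold is adaptive rather than fixed, and the mixture components are estimated (parametrically or nonparametrically) rather than assumed known; this sketch only shows the basic "threshold the local fdr" mechanism.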
© 2014. This manuscript version is made available under the CC-BY-NC-ND 4.0 license.
high dimensional sparse inference, Bayes classification rule, nonparametric estimation, false discoveries, false nondiscoveries
Fuki, I., Brown, L. D., Han, X., & Zhao, L. (2014). Hunting for Significance: Bayesian Classifiers Under a Mixture Loss Function. Journal of Statistical Planning and Inference, 154, 62–71. http://dx.doi.org/10.1016/j.jspi.2014.02.010
Date Posted: 27 November 2017
This document has been peer reviewed.