IEEE Transactions on Information Theory
Common approximations for the minimum description length (MDL) criterion imply that the cost of adding a parameter to a model fit to n observations is about (1/2) log n bits. While effective for parameters that are large on a standardized scale, this approximation overstates the parameter cost near zero. A uniform approximation and local asymptotic argument show that adding a small parameter that is about two standard errors away from zero produces a model whose description length is shorter than that of the comparable model which sets this parameter to zero. This result implies that the decision rule for adding a model parameter is comparable to a traditional statistical hypothesis test. Encoding the parameter produces a shorter description length when the corresponding estimator is about two standard errors away from zero, unlike a model selection criterion such as BIC, whose threshold increases logarithmically in n.
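The contrast in the abstract can be sketched numerically. The following snippet (an illustration, not code from the paper) compares the approximate standardized-score thresholds at which each criterion admits one extra parameter: BIC charges (1/2) log n bits, so a single parameter enters when z^2 > log n, i.e. |z| > sqrt(log n), while the paper's local asymptotic analysis yields a roughly constant threshold of about two standard errors.

```python
import math

def bic_z_threshold(n):
    """Approximate |z| needed for BIC to add one parameter:
    the gain in log-likelihood, z^2 / 2, must exceed (1/2) log n."""
    return math.sqrt(math.log(n))

# Roughly constant threshold suggested by the local asymptotic
# MDL argument described in the abstract (about two standard errors).
MDL_Z_THRESHOLD = 2.0

for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: BIC needs |z| > {bic_z_threshold(n):.2f}, "
          f"MDL needs |z| > about {MDL_Z_THRESHOLD:.1f}")
```

As n grows, the BIC threshold drifts upward (about 2.15 at n = 100 but 3.72 at n = 1,000,000), whereas the two-standard-error rule stays fixed, which is the distinction the abstract draws.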
Keywords: BIC, hypothesis test, model selection, two-part code, universal code
Foster, D. P., & Stine, R. A. (1999). Local Asymptotics and the Minimum Description Length. IEEE Transactions on Information Theory, 45 (4), 1289-1293. http://dx.doi.org/10.1109/18.761287
Date Posted: 27 November 2017
This document has been peer reviewed.