Statistics Papers

Document Type

Journal Article

Date of this Version

5-1999

Publication Source

IEEE Transactions on Information Theory

Volume

45

Issue

4

Start Page

1289

Last Page

1293

DOI

10.1109/18.761287

Abstract

Common approximations for the minimum description length (MDL) criterion imply that the cost of adding a parameter to a model fit to n observations is about (1/2) log n bits. While effective for parameters that are large on a standardized scale, this approximation overstates the parameter cost near zero. A uniform approximation and a local asymptotic argument show that adding a small parameter which is about two standard errors away from zero produces a model whose description length is shorter than that of the comparable model which sets this parameter to zero. This result implies that the decision rule for adding a model parameter is comparable to a traditional statistical hypothesis test: encoding the parameter produces a shorter description length when the corresponding estimator is about two standard errors away from zero, unlike a model selection criterion such as BIC, whose threshold increases logarithmically in n.
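
A minimal numeric illustration of the threshold comparison described in the abstract (not part of the paper): assuming a standardized z-statistic for the estimated parameter, BIC's per-parameter penalty of (1/2) log n bits corresponds to retaining the parameter when z^2 > log n, whereas the paper's local asymptotic analysis of MDL yields a roughly constant two-standard-error cutoff. The Python sketch below simply tabulates the two cutoffs for a few sample sizes; the function name, constant, and sample sizes are illustrative.

    import math

    def bic_z_cutoff(n: int) -> float:
        # Cutoff on |z| implied by BIC's (1/2) log n per-parameter penalty:
        # the fitted parameter is retained when z^2 > log n.
        return math.sqrt(math.log(n))

    # Roughly constant two-standard-error cutoff suggested by the paper's
    # local asymptotic treatment of MDL (illustrative constant).
    MDL_Z_CUTOFF = 2.0

    for n in (100, 1_000, 10_000, 100_000, 1_000_000):
        print(f"n = {n:>9,}  BIC cutoff |z| > {bic_z_cutoff(n):.2f}  "
              f"MDL-style cutoff |z| > {MDL_Z_CUTOFF:.1f}")

The table this prints shows the BIC cutoff rising slowly with n (about 2.1 at n = 100 and 3.7 at n = 1,000,000), while the two-standard-error rule stays fixed, which is the contrast the abstract draws.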

Keywords

BIC, hypothesis test, model selection, two-part code, universal code


Date Posted: 27 November 2017

This document has been peer reviewed.