Statistics Papers

Document Type

Journal Article

Date of this Version

6-2002

Publication Source

IEEE Transactions on Information Theory

Volume

48

Issue

6

Start Page

1713

Last Page

1720

DOI

10.1109/TIT.2002.1003851

Abstract

We offer two noiseless codes for blocks of integers X^n = (X_1, ..., X_n). We provide explicit bounds on the relative redundancy that are valid for any distribution F in the class of memoryless sources with a possibly infinite alphabet whose marginal distribution is monotone. Specifically, we show that the expected code length L(X^n) of our first universal code is dominated by a linear function of the entropy of X^n. Further, we present a second universal code that is efficient in that its length is bounded by nH_F + o(nH_F), where H_F is the entropy of F, which is allowed to vary with n. Since these bounds hold for any n and any monotone F, we are able to show that our codes are strongly minimax with respect to relative redundancy (as defined by Elias (1975)). Our proofs make use of the elegant inequality due to Aaron Wyner (1972).
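The paper's two constructions are not reproduced here. As a point of reference for universal coding of positive integers in the sense of Elias (1975), the following is a minimal sketch of the classic Elias gamma code; it is not the authors' method, only an illustration of a prefix-free integer code whose codeword length grows with log of the value.

```python
# Minimal sketch (not the paper's construction): Elias gamma coding,
# a classic universal prefix code for positive integers.

def elias_gamma_encode(x: int) -> str:
    """Encode a positive integer x as an Elias gamma codeword (bit string)."""
    if x < 1:
        raise ValueError("Elias gamma coding requires a positive integer")
    binary = bin(x)[2:]                        # binary form, floor(log2 x) + 1 bits
    return "0" * (len(binary) - 1) + binary    # unary length prefix, then the value

def elias_gamma_decode(bits: str) -> tuple[int, str]:
    """Decode one codeword from the front of `bits`; return (value, remaining bits)."""
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    value = int(bits[zeros:2 * zeros + 1], 2)
    return value, bits[2 * zeros + 1:]

# Example: code a block of integers symbol by symbol.
block = [5, 1, 17, 3]
encoded = "".join(elias_gamma_encode(x) for x in block)
print(encoded)  # each codeword for x uses 2*floor(log2 x) + 1 bits
```

The per-symbol length of 2*floor(log2 x) + 1 bits illustrates the kind of logarithmic growth against which redundancy bounds of the form nH_F + o(nH_F) are measured; the codes analyzed in the paper achieve sharper guarantees uniformly over monotone F.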

Keywords

codes, entropy, memoryless systems, sequences, statistical analysis, code length, explicit bounds, finite sequences, infinite alphabet, linear function, memoryless sources, minimax code, monotone marginal distribution, noiseless codes, relative redundancy, universal codes, control systems, estimation theory, hidden Markov models, information theory, stochastic processes, testing


Date Posted: 27 November 2017

This document has been peer reviewed.