Conditional entropy and error probability
Date
2008
Authors
Ho, S.W.
Verdú, S.
Type
Conference paper
Citation
IEEE International Symposium on Information Theory - Proceedings, 2008, pp.1622-1626
Conference Name
2008 IEEE International Symposium on Information Theory (6 Jul 2008 - 11 Jul 2008 : Toronto, Canada)
Abstract
Fano's inequality relates the error probability and the conditional entropy of a finitely-valued random variable X given another random variable Y. It is not necessarily tight when the marginal distribution of X is fixed. In this paper, we consider both finite and countably infinite alphabets. A tight upper bound on the conditional entropy of X given Y is given in terms of the error probability and the marginal distribution of X. A new lower bound on the conditional entropy for countably infinite alphabets is also found. The equivalence of the reliability criteria of vanishing error probability and vanishing conditional entropy is established in wide generality.
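The classical Fano bound mentioned in the abstract, H(X|Y) ≤ h(Pe) + Pe·log(|X|−1), is easy to evaluate numerically. A minimal sketch (the function names and the example values Pe = 0.1, |X| = 4 are illustrative, not taken from the paper):

```python
import math

def binary_entropy(p):
    """Binary entropy h(p) in bits, with h(0) = h(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_upper_bound(pe, m):
    """Classical Fano bound on H(X|Y) in bits:
    H(X|Y) <= h(pe) + pe * log2(m - 1),
    for error probability pe and a finite alphabet of size m >= 2.
    This is the baseline bound; the paper's tightened bound additionally
    uses the marginal distribution of X, which is not modeled here."""
    return binary_entropy(pe) + pe * math.log2(m - 1)

# For Pe = 0.1 and a 4-letter alphabet the bound is about 0.627 bits,
# so small error probability forces small conditional entropy -- the
# finite-alphabet direction of the equivalence noted in the abstract.
print(fano_upper_bound(0.1, 4))
```

Note that this equivalence can fail for countably infinite alphabets, which is why the paper treats that case separately with a new lower bound.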
Rights
Copyright 2008 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.