Introducing Information Measures via Inference [Lecture Notes]

Research output: Contribution to journal › Article

1 Scopus citations

Abstract

Information measures, such as the entropy and the Kullback-Leibler (KL) divergence, are typically introduced from an abstract viewpoint based on a notion of "surprise." Accordingly, the entropy of a given random variable (rv) is larger if its realization, when revealed, is on average more "surprising" (see, e.g., [1]-[3]). The goal of this lecture note is to provide a principled and intuitive introduction to information measures that builds on inference, i.e., estimation and hypothesis testing. Specifically, entropy and conditional entropy measures are defined using variational characterizations that can be interpreted in terms of the minimum Bayes risk in an estimation problem. Divergence metrics are similarly described using variational expressions derived via mismatched estimation or binary hypothesis testing principles. The classical Shannon entropy and the KL divergence are recovered as special cases of more general families of information measures.
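As a numerical illustration of the inference viewpoint sketched in the abstract (the distributions below are hypothetical, not taken from the article): under the log-loss, the minimum Bayes risk of estimating X ~ p over all candidate predictive distributions q is achieved at q = p and equals the Shannon entropy H(p), while the excess risk of a mismatched estimator q is exactly the KL divergence KL(p || q).

```python
import numpy as np

def log_loss(p, q):
    """Expected log-loss E_p[-log q(X)] of predicting X ~ p with model q."""
    return -np.sum(p * np.log(q))

p = np.array([0.5, 0.3, 0.2])   # true distribution (hypothetical example)
q = np.array([0.6, 0.2, 0.2])   # mismatched predictive distribution

entropy = log_loss(p, p)        # minimum Bayes risk, attained at q = p
excess = log_loss(p, q) - entropy

# The excess risk of the mismatched estimator equals KL(p || q).
kl = np.sum(p * np.log(p / q))
assert np.isclose(excess, kl)
assert excess >= 0              # any q != p incurs a strictly larger risk
```

This numerically verifies the variational characterization H(p) = min_q E_p[-log q(X)] for one example; the article generalizes the construction to other loss functions, yielding broader families of entropy and divergence measures.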

Original language: English (US)
Article number: 8254254
Pages (from-to): 167-171
Number of pages: 5
Journal: IEEE Signal Processing Magazine
Volume: 35
Issue number: 1
DOIs
State: Published - Jan 2018
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Applied Mathematics
  • Electrical and Electronic Engineering
