A Kullback–Leibler View of Maximum Entropy and Maximum Log-Probability Methods

Ali E. Abbas, Andrea H. Cadenbach, Ehsan Salimi

Research output: Contribution to journal › Article › peer-review

Abstract

Entropy methods provide a convenient and general approach to assigning a probability distribution when only partial information is available. The minimum cross-entropy principle selects the distribution that minimizes the Kullback–Leibler divergence from a target distribution subject to the given constraints. This general principle encompasses a wide variety of distributions and generalizes other methods that have been proposed independently. There remains, however, some confusion in the literature about the breadth of entropy methods. In particular, the asymmetry of the Kullback–Leibler divergence yields two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of the two methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.
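To make the two special cases concrete, here is a short sketch in our own notation (not taken from the paper): let p = (p_1, …, p_n) be a discrete distribution and u the uniform distribution with u_i = 1/n. Then

\[
D(p \,\|\, u) = \sum_{i=1}^{n} p_i \log \frac{p_i}{1/n} = \log n - H(p),
\qquad
H(p) = -\sum_{i=1}^{n} p_i \log p_i,
\]

so minimizing D(p‖u) subject to the constraints is equivalent to maximizing the entropy H(p). Reversing the arguments of the divergence,

\[
D(u \,\|\, p) = \sum_{i=1}^{n} \frac{1}{n} \log \frac{1/n}{p_i} = -\log n - \frac{1}{n} \sum_{i=1}^{n} \log p_i,
\]

so minimizing D(u‖p) is equivalent to maximizing the log-probability \sum_{i} \log p_i. The two methods differ only in which argument of the asymmetric divergence holds the uniform target.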
Original language: American English
Journal: Entropy
Volume: 19
DOIs:
State: Published - May 19, 2017

Keywords

  • entropy
  • minimum cross entropy
  • joint probability distribution

Disciplines

  • Mathematics
  • Applied Mathematics
  • Statistics and Probability
