Kullback Leibler Property of Kernel Mixture Priors in Bayesian Density Estimation

Yuefeng Wu, Subhashis Ghosal

Research output: Contribution to journal › Article › peer-review

Abstract

Positivity of the prior probability of a Kullback-Leibler neighborhood around the true density, commonly known as the Kullback-Leibler property, plays a fundamental role in posterior consistency. A popular prior for Bayesian density estimation is given by a Dirichlet mixture, where the kernels are chosen depending on the sample space and the class of densities to be estimated. The Kullback-Leibler property of the Dirichlet mixture prior has been shown for some special kernels, such as the normal density or Bernstein polynomials, under appropriate conditions. In this paper, we obtain easily verifiable sufficient conditions under which a prior obtained by mixing a general kernel possesses the Kullback-Leibler property. We study a wide variety of kernels used in practice, including the normal, t, histogram, gamma, and Weibull densities, among others, and show that the Kullback-Leibler property holds if some easily verifiable conditions are satisfied at the true density. This gives a catalog of conditions required for the Kullback-Leibler property, which can be readily used in applications.
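For readers less familiar with the terminology, the two central objects in the abstract can be written out in standard notation. The display below is a minimal sketch using generic symbols (a true density f_0, a kernel K(x; θ), a mixing distribution P, a Dirichlet process DP(α), and a prior Π on densities); these symbols are illustrative placeholders and need not match the paper's own notation.

% Kernel mixture prior: mix a parametric kernel K(x; theta) over a random
% mixing distribution P drawn from a Dirichlet process (generic notation).
\[
  f_P(x) \;=\; \int K(x;\theta)\, \mathrm{d}P(\theta),
  \qquad P \sim \mathrm{DP}(\alpha).
\]
% Kullback-Leibler property at the true density f_0: every
% Kullback-Leibler neighborhood of f_0 receives positive prior mass.
\[
  \Pi\!\left\{ f :\; \int f_0(x)\,\log\frac{f_0(x)}{f(x)}\,\mathrm{d}x \;<\; \varepsilon \right\} \;>\; 0
  \qquad \text{for every } \varepsilon > 0.
\]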
Original language: American English
Journal: Electronic Journal of Statistics
Volume: 2
State: Published - 2008

Disciplines

  • Physical Sciences and Mathematics
