Asymptotic Properties for Methods Combining the Minimum Hellinger Distance Estimate and the Bayesian Nonparametric Density Estimate

Yuefeng Wu, Giles Hooker

Research output: Contribution to journal › Article › peer-review

Abstract

In frequentist inference, minimizing the Hellinger distance between a kernel density estimate and a parametric family produces estimators that are both robust to outliers and statistically efficient when the parametric model is correct. This paper seeks to extend these results to the use of nonparametric Bayesian density estimators within disparity methods. We propose two estimators: one replaces the kernel density estimator with the expected posterior density from a random histogram prior; the other induces a posterior over parameters through the posterior for the random histogram. We show that it is possible to adapt the mathematical machinery of efficient influence functions from semiparametric models to demonstrate that both our estimators are efficient in the sense of achieving the Cramér-Rao lower bound. We further demonstrate a Bernstein-von Mises result for our second estimator, indicating that its posterior is asymptotically Gaussian. In addition, the robustness properties of classical minimum Hellinger distance estimators continue to hold.
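The sketch below illustrates the classical frequentist setting described in the first sentence of the abstract: a minimum Hellinger distance estimate for a Gaussian location-scale family, computed against a kernel density estimate on a grid. It is not the authors' method; the paper instead replaces the kernel density estimate with a Bayesian nonparametric (random histogram) posterior density. All function and variable names here are illustrative.

```python
# Minimal sketch of minimum Hellinger distance estimation (MHDE) with a KDE.
# Assumption: the parametric family is N(mu, sigma^2); the paper's estimators
# would substitute the expected posterior density from a random histogram prior
# for the KDE used here.
import numpy as np
from scipy.stats import norm, gaussian_kde
from scipy.optimize import minimize

def hellinger_sq(f_vals, g_vals, dx):
    # Squared Hellinger distance approximated on a grid:
    # H^2(f, g) = 0.5 * integral (sqrt(f) - sqrt(g))^2 dx
    return 0.5 * np.sum((np.sqrt(f_vals) - np.sqrt(g_vals)) ** 2) * dx

def mhde_normal(data, grid_size=512):
    # Nonparametric density estimate (here a KDE) evaluated on a grid.
    kde = gaussian_kde(data)
    grid = np.linspace(data.min() - 3.0, data.max() + 3.0, grid_size)
    dx = grid[1] - grid[0]
    g_hat = kde(grid)

    # Minimize the Hellinger distance over the parametric family N(mu, sigma^2).
    def objective(theta):
        mu, log_sigma = theta
        f_theta = norm.pdf(grid, loc=mu, scale=np.exp(log_sigma))
        return hellinger_sq(f_theta, g_hat, dx)

    init = np.array([np.median(data), np.log(data.std())])
    res = minimize(objective, init, method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Clean N(0, 1) data plus a few gross outliers: the MHDE should remain
    # close to (0, 1), illustrating the robustness property noted above.
    x = np.concatenate([rng.normal(0.0, 1.0, 200), np.full(10, 15.0)])
    print(mhde_normal(x))
```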

Original language: American English
Journal: Entropy
Volume: 20
DOIs
State: Published - Dec 11 2018

Disciplines

  • Physical Sciences and Mathematics
  • Mathematics
