Publication | Open Access
General Bayesian updating and the loss-likelihood bootstrap
Citations: 104 | References: 28 | Year: 2019
Keywords: General Bayesian Updating, Bayesian Statistics, Engineering, Data Science, Weighted Likelihood Bootstrap, Uncertainty Quantification, Approximate Bayesian Posterior, Statistical Inference, Statistics, Bayesian Inference, Bayesian Hierarchical Modeling, Approximate Bayesian Computation
In this paper we revisit the weighted likelihood bootstrap, a method that generates samples from an approximate Bayesian posterior of a parametric model. We show that the same method can be derived, without approximation, under a Bayesian nonparametric model in which the parameter of interest is defined by minimizing an expected negative log-likelihood under an unknown sampling distribution. This interpretation enables us to extend the weighted likelihood bootstrap to posterior sampling for parameters minimizing an expected loss. We call this method the loss-likelihood bootstrap, and we make a connection between it and general Bayesian updating, which is a way of updating prior belief distributions without needing to construct a global probability model, yet requires the calibration of two forms of loss function. The loss-likelihood bootstrap is used to calibrate the general Bayesian posterior by matching asymptotic Fisher information. We demonstrate the proposed method on a number of examples.
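The core mechanism described in the abstract can be illustrated with a minimal sketch. In the loss-likelihood bootstrap, each posterior draw re-weights the observations with Dirichlet(1, ..., 1) weights and minimizes the weighted empirical loss. The example below is a hypothetical illustration (not code from the paper), using squared-error loss, for which the weighted-loss minimizer is the weighted sample mean in closed form; the function name and data are assumptions for demonstration.

```python
import numpy as np

def loss_likelihood_bootstrap_mean(x, B=2000, seed=0):
    """Sketch of the loss-likelihood bootstrap for the parameter
    minimizing expected squared-error loss (the population mean).

    Each of the B draws assigns Dirichlet(1, ..., 1) weights to the
    n observations and minimizes the weighted empirical loss
    sum_i w_i * (x_i - theta)^2, whose minimizer is the weighted
    sample mean, so no numerical optimizer is needed here.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Dirichlet(1, ..., 1) weights via normalized standard exponentials.
    w = rng.exponential(size=(B, n))
    w /= w.sum(axis=1, keepdims=True)
    # One weighted mean per bootstrap draw: (B, n) @ (n,) -> (B,).
    return w @ np.asarray(x, dtype=float)

# Toy data, purely illustrative.
x = np.array([1.2, -0.5, 0.3, 2.1, 0.9, -1.4, 0.7, 1.0])
draws = loss_likelihood_bootstrap_mean(x)
```

For a general loss one would replace the closed-form weighted mean with a numerical minimization of the weighted empirical loss for each draw; the resulting samples approximate a posterior for the loss-minimizing parameter without specifying a parametric likelihood.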