Low Information Omnibus (LIO) Priors for Dirichlet Process Mixture Models

Publication Type
Journal Article
Year of Publication
2019
Authors
Shi, Yushu; Martens, Michael; Banerjee, Anjishnu; Laud, Purushottam
Journal
Bayesian Analysis
Volume
14
Pagination
677-702
Date Published
June 11
Keywords
BMT; Dirichlet; LIO; Mixture Models; Omnibus; Statistical
Abstract
Dirichlet process mixture (DPM) models provide flexible modeling for distributions of data as an infinite mixture of distributions from a chosen collection. Specifying priors for these models in individual data contexts can be challenging. In this paper, we introduce a scheme that requires the investigator to specify only simple scaling information. This is used to transform the data to a fixed scale on which a low information prior is constructed. Samples from the posterior with the rescaled data are transformed back for inference on the original scale. The low information prior is selected to provide a wide variety of components for the DPM to generate flexible distributions for the data on the fixed scale. The method can be applied to all DPM models with kernel functions closed under a suitable scaling transformation. Construction of the low information prior, however, is kernel dependent. Using DPM-of-Gaussians and DPM-of-Weibulls models as examples, we show that the method provides accurate estimates of a diverse collection of distributions that includes skewed, multimodal, and highly dispersed members. With the recommended priors, repeated data simulations show performance comparable to that of standard empirical estimates. Finally, we show weak convergence of posteriors with the proposed priors for both kernels considered.
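
To make the workflow described in the abstract concrete, the sketch below illustrates the rescale-then-fit-then-back-transform idea for the DPM-of-Gaussians case. This is not the authors' method: scikit-learn's BayesianGaussianMixture fits a truncated Dirichlet process mixture by variational inference rather than sampling the posterior under an LIO prior, and the median/IQR rescaling is only a hypothetical example of the "simple scaling information" the scheme asks the investigator to supply.

Illustrative sketch (Python):

# Minimal stand-in for the abstract's workflow: (1) transform data to a
# fixed scale using investigator-supplied scaling information, (2) fit a
# DP mixture of Gaussians with a diffuse prior on that scale, (3) map
# density estimates back to the original scale. Not the LIO prior itself.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Toy data: a skewed, bimodal sample.
x = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.gamma(3.0, 2.0, 700)])

# Step 1: simple scaling information (hypothetical choice: median and IQR).
center = np.median(x)
spread = np.subtract(*np.percentile(x, [75, 25]))
y = (x - center) / spread  # data on the fixed scale

# Step 2: truncated DP mixture of Gaussians on the fixed scale.
dpm = BayesianGaussianMixture(
    n_components=30,  # truncation level of the stick-breaking representation
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,  # diffuse concentration, for illustration
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(y.reshape(-1, 1))

# Step 3: back-transform the density estimate. If y = (x - c) / s, then
# f_X(x) = f_Y((x - c) / s) / s by the change-of-variables Jacobian.
grid = np.linspace(x.min(), x.max(), 400)
log_dens_fixed = dpm.score_samples(((grid - center) / spread).reshape(-1, 1))
dens_original = np.exp(log_dens_fixed) / spread

The Jacobian division in the last line is the key back-transformation step: because the mixture is fit on the rescaled data, its density must be divided by the spread to be valid on the original scale.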