Review Articles

On the non-local priors for sparsity selection in high-dimensional Gaussian DAG models

Xuan Cao,

Division of Statistics and Data Science, Department of Mathematical Sciences, University of Cincinnati, Cincinnati, OH, USA

Fang Yang

Division of Statistics and Data Science, Department of Mathematical Sciences, University of Cincinnati, Cincinnati, OH, USA

Pages 332-345 | Received 27 May 2020, Accepted 05 Jun 2021, Published online: 05 Sep 2021

We consider sparsity selection for the Cholesky factor L of the inverse covariance matrix in high-dimensional Gaussian DAG models. The sparsity is induced over the space of L via non-local priors, namely the product moment (pMOM) prior [Johnson, V., & Rossell, D. (2012). Bayesian model selection in high-dimensional settings. Journal of the American Statistical Association, 107(498), 649–660. https://doi.org/10.1080/01621459.2012.682536] and the hierarchical hyper-pMOM prior [Cao, X., Khare, K., & Ghosh, M. (2020). High-dimensional posterior consistency for hierarchical non-local priors in regression. Bayesian Analysis, 15(1), 241–262. https://doi.org/10.1214/19-BA1154]. We establish model selection consistency for the Cholesky factor under conditions more relaxed than those in the literature, and implement an efficient MCMC algorithm that selects the sparsity pattern of each column of L in parallel. We demonstrate the validity of our theoretical results via numerical simulations, and use further simulations to show that our sparsity selection approach is competitive with existing methods.
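For readers unfamiliar with the pMOM form, the sketch below evaluates the per-coordinate product moment (pMOM) log-density of Johnson and Rossell (2012) with identity scale matrix; the function name and defaults (r = 1, τ = 1, σ² = 1) are illustrative choices, not code from the paper.

```python
import numpy as np

def log_pmom(beta, tau=1.0, sigma2=1.0, r=1):
    """Log of the product-moment (pMOM) prior density at vector `beta`.

    Per-coordinate form (identity scale matrix):
        pi(b) = b^(2r) exp(-b^2 / (2*tau*sigma2))
                / ( sqrt(2*pi*tau*sigma2) * (tau*sigma2)^r * (2r-1)!! )
    The moment normaliser (2r-1)!! makes each univariate factor integrate to 1.
    """
    beta = np.asarray(beta, dtype=float)
    v = tau * sigma2
    double_fact = float(np.prod(np.arange(1, 2 * r, 2)))  # (2r-1)!!
    log_const = -0.5 * np.log(2 * np.pi * v) - r * np.log(v) - np.log(double_fact)
    return float(np.sum(log_const + 2 * r * np.log(np.abs(beta)) - beta**2 / (2 * v)))

# Sanity check: the univariate density should integrate to about 1.
grid = np.linspace(-10.0, 10.0, 200_000)  # even count, so beta = 0 is not a grid point
density = np.exp([log_pmom(np.array([b])) for b in grid])
total_mass = float(np.sum(density) * (grid[1] - grid[0]))  # Riemann sum
```

The defining feature visible here is that the density vanishes at the origin (the `2r*log|beta|` term), which is what makes the prior "non-local": mass is pushed away from zero, sharply penalising models that include near-zero coefficients.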

References

• Altomare, D., Consonni, G., & La Rocca, L. (2013). Objective Bayesian search of Gaussian directed acyclic graphical models for ordered variables with non-local priors. Biometrics, 69(2), 478–487. https://doi.org/10.1111/biom.v69.2
• Banerjee, S., & Ghosal, S. (2014). Posterior convergence rates for estimating large precision matrices using graphical models. Electronic Journal of Statistics, 8(2), 2111–2137. https://doi.org/10.1214/14-EJS945
• Banerjee, S., & Ghosal, S. (2015). Bayesian structure learning in graphical models. Journal of Multivariate Analysis, 136, 147–162. https://doi.org/10.1016/j.jmva.2015.01.015
• Ben-David, E., Li, T., Massam, H., & Rajaratnam, B. (2016). High dimensional Bayesian inference for Gaussian directed acyclic graph models (Tech. Rep.). http://arxiv.org/abs/1109.4371
• Bhadra, A., & Mallick, B. (2013). Joint high-dimensional Bayesian variable and covariance selection with an application to eQTL analysis. Biometrics, 69(2), 447–457. https://doi.org/10.1111/biom.v69.2
• Bickel, P. J., & Levina, E. (2008). Regularized estimation of large covariance matrices. Annals of Statistics, 36(1), 199–227. https://doi.org/10.1214/009053607000000758
• Cai, T., Liu, W., & Luo, X. (2011). A constrained minimization approach to sparse precision matrix estimation. Journal of the American Statistical Association, 106(494), 594–607. https://doi.org/10.1198/jasa.2011.tm10155
• Cao, X., Khare, K., & Ghosh, M. (2019). Posterior graph selection and estimation consistency for high-dimensional Bayesian DAG models. Annals of Statistics, 47(1), 319–348. https://doi.org/10.1214/18-AOS1689
• Cao, X., Khare, K., & Ghosh, M. (2020). High-dimensional posterior consistency for hierarchical non-local priors in regression. Bayesian Analysis, 15(1), 241–262. https://doi.org/10.1214/19-BA1154
• Carvalho, C. M., & Scott, J. G. (2009). Objective Bayesian model selection in Gaussian graphical models. Biometrika, 96(3), 497–512. https://doi.org/10.1093/biomet/asp017
• El Karoui, N. (2008). Spectrum estimation for large dimensional covariance matrices using random matrix theory. Annals of Statistics, 36(6), 2757–2790. https://doi.org/10.1214/07-AOS581
• Huang, J., Liu, N., Pourahmadi, M., & Liu, L. (2006). Covariance selection and estimation via penalised normal likelihood. Biometrika, 93(1), 85–98. https://doi.org/10.1093/biomet/93.1.85
• Johnson, V., & Rossell, D. (2010). On the use of non-local prior densities in Bayesian hypothesis tests. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(2), 143–170. https://doi.org/10.1111/rssb.2010.72.issue-2
• Johnson, V., & Rossell, D. (2012). Bayesian model selection in high-dimensional settings. Journal of the American Statistical Association, 107(498), 649–660. https://doi.org/10.1080/01621459.2012.682536
• Khare, K., Oh, S., Rahman, S., & Rajaratnam, B. (2017). A convex framework for high-dimensional sparse Cholesky based covariance estimation in Gaussian DAG models [Preprint, Department of Statistics, University of Florida].
• Lee, K., Lee, J., & Lin, L. (2019). Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors. Annals of Statistics, 47(6), 3413–3437. https://doi.org/10.1214/18-AOS1783
• Liang, F., Paulo, R., Molina, G., Clyde, M. A., & Berger, J. O. (2008). Mixtures of g priors for Bayesian variable selection. Journal of the American Statistical Association, 103(481), 410–423. https://doi.org/10.1198/016214507000001337
• Narisetty, N., & He, X. (2014). Bayesian variable selection with shrinking and diffusing priors. Annals of Statistics, 42(2), 789–817. https://doi.org/10.1214/14-AOS1207
• Niu, Y., Pati, D., & Mallick, B. (2019). Bayesian graph selection consistency under model misspecification. arXiv:1901.04134
• Pourahmadi, M. (2007). Cholesky decompositions and estimation of a covariance matrix: Orthogonality of variance–correlation parameters. Biometrika, 94(4), 1006–1013. https://doi.org/10.1093/biomet/asm073
• Rossell, D., Telesca, D., & Johnson, V. E. (2013). High-dimensional Bayesian classifiers using non-local priors. In Statistical models for data analysis. Springer.
• Scott, J. G., & Carvalho, C. M. (2008). Feature-inclusion stochastic search for Gaussian graphical models. Journal of Computational and Graphical Statistics, 17(4), 790–808. https://doi.org/10.1198/106186008X382683
• Shin, M., Bhattacharya, A., & Johnson, V. (2018). Scalable Bayesian variable selection using nonlocal prior densities in ultrahigh-dimensional settings. Statistica Sinica, 28(2), 1053–1078. https://doi.org/10.5705/ss.202016.0167
• Shojaie, A., & Michailidis, G. (2010). Penalized likelihood methods for estimation of sparse high-dimensional directed acyclic graphs. Biometrika, 97(3), 519–538. https://doi.org/10.1093/biomet/asq038
• Tan, L. S. L., Jasra, A., De Iorio, M., & Ebbels, T. M. D. (2017). Bayesian inference for multiple Gaussian graphical models with application to metabolic association networks. The Annals of Applied Statistics, 11(4), 2222–2251. https://doi.org/10.1214/17-AOAS1076
• van de Geer, S., & Bühlmann, P. (2013). ${\ell }_{0}$-penalized maximum likelihood for sparse directed acyclic graphs. The Annals of Statistics, 41(2), 536–567. https://doi.org/10.1214/13-AOS1085
• Wu, H.-H. (2016). Nonlocal priors for Bayesian variable selection in generalized linear models and generalized linear mixed models and their applications in biology data [PhD thesis, University of Missouri].
• Xiang, R., Khare, K., & Ghosh, M. (2015). High dimensional posterior convergence rates for decomposable graphical models. Electronic Journal of Statistics, 9(2), 2828–2854. https://doi.org/10.1214/15-EJS1084
• Yang, Y., Wainwright, M. J., & Jordan, M. I. (2016). On the computational complexity of high-dimensional Bayesian variable selection. Annals of Statistics, 44(6), 2497–2532. https://doi.org/10.1214/15-AOS1417
• Yu, G., & Bien, J. (2017). Learning local dependence in ordered data. Journal of Machine Learning Research, 18(42), 1–60.
• Zhang, T., & Zou, H. (2014). Sparse precision matrix estimation via lasso penalized D-trace loss. Biometrika, 101(1), 103–120. https://doi.org/10.1093/biomet/ast059

To cite this article: Xuan Cao & Fang Yang (2021) On the non-local priors for sparsity selection in high-dimensional Gaussian DAG models, Statistical Theory and Related Fields, 5:4, 332-345, DOI: 10.1080/24754269.2021.1963182