
Rejoinder on ‘A review of distributed statistical inference’

Yuan Gao,

School of Statistics and Key Laboratory of Advanced Theory and Application in Statistics and Data Science – MOE, East China Normal University, Shanghai, People's Republic of China

Weidong Liu,

School of Mathematical Sciences – School of Life Sciences and Biotechnology – MOE Key Lab of Artificial Intelligence, Shanghai Jiao Tong University, Shanghai, People's Republic of China

Hansheng Wang,

Guanghua School of Management, Peking University, Beijing, People's Republic of China

Xiaozhou Wang,

School of Statistics and Key Laboratory of Advanced Theory and Application in Statistics and Data Science – MOE, East China Normal University, Shanghai, People's Republic of China

Yibo Yan,

School of Statistics and Key Laboratory of Advanced Theory and Application in Statistics and Data Science – MOE, East China Normal University, Shanghai, People's Republic of China

Riquan Zhang

School of Statistics and Key Laboratory of Advanced Theory and Application in Statistics and Data Science – MOE, East China Normal University, Shanghai, People's Republic of China

rqzhang@stat.ecnu.edu.cn

Pages 111-113 | Received 26 Nov 2021, Accepted 10 Jan 2022, Published online: 09 Feb 2022


To cite this article: Yuan Gao, Weidong Liu, Hansheng Wang, Xiaozhou Wang, Yibo Yan & Riquan Zhang (2022): Rejoinder on ‘A review of distributed statistical inference’, Statistical Theory and Related Fields, DOI: 10.1080/24754269.2022.2035304

To link to this article: https://doi.org/10.1080/24754269.2022.2035304