Review Articles

Dimension reduction with expectation of conditional difference measure

Wenhui Sheng

Department of Mathematical and Statistical Sciences, Marquette University, Milwaukee, WI, USA

Qingcong Yuan

Biostatistics and Programming, Sanofi US, Bridgewater, NJ, USA

Received 26 Sep. 2022; Accepted 14 Feb. 2023; Published online: 13 Mar. 2023

In this article, we introduce a flexible model-free approach to sufficient dimension reduction using the expectation of conditional difference measure. Without strict conditions such as the linearity condition or the constant covariance condition, the method estimates the central subspace exhaustively and efficiently under linear or nonlinear relationships between the response and predictors. The method is especially useful when the response is categorical. We also establish the √n-consistency and asymptotic normality of the estimator. The efficacy of our method is demonstrated through both simulations and a real data analysis.
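The exact form of the expectation of conditional difference measure is not reproduced on this page. As a rough illustration of the optimization template that such measure-based sufficient dimension reduction methods share, the sketch below estimates a single-index direction by maximizing a sample dependence measure between the projected predictors and the response; it substitutes distance covariance (a related measure, cf. Sheng & Yin, 2016) for the paper's ECD measure, and the simulated model, sample sizes, and function names are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def dcov2(u, v):
    """Squared sample distance covariance between two 1-D samples."""
    a = np.abs(u[:, None] - u[None, :])
    b = np.abs(v[:, None] - v[None, :])
    # double-center the pairwise distance matrices
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return (A * B).mean()

rng = np.random.default_rng(0)
n, p = 200, 5
beta_true = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
X = rng.standard_normal((n, p))
Y = np.exp(X @ beta_true) + 0.1 * rng.standard_normal(n)  # nonlinear link

def neg_objective(b):
    b = b / np.linalg.norm(b)  # restrict the direction to the unit sphere
    return -dcov2(X @ b, Y)

# multi-start Nelder-Mead to reduce the risk of a local optimum
best = min(
    (minimize(neg_objective, rng.standard_normal(p), method="Nelder-Mead")
     for _ in range(5)),
    key=lambda r: r.fun,
)
beta_hat = best.x / np.linalg.norm(best.x)
print(abs(beta_hat @ beta_true))  # close to 1 when the direction is recovered
```

Estimating a d-dimensional central subspace follows the same template with a p × d matrix B optimized under an orthonormality constraint, and the measure evaluated between Bᵀx and Y.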


To cite this article: Wenhui Sheng & Qingcong Yuan (2023) Dimension reduction with expectation of conditional difference measure, Statistical Theory and Related Fields, 7:3, 188-201, DOI: 10.1080/24754269.2023.2182136
