Penalized principal logistic regression for sparse sufficient dimension reduction
SJ Shin, A Artemiou - Computational Statistics & Data Analysis, 2017 - Elsevier
Abstract
Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of the predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desirable to achieve variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods.
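To make the idea concrete, here is a minimal numerical sketch of the principal-machine strategy that PLR belongs to (following the general recipe of principal support vector machines and related methods, not the authors' exact algorithm): dichotomize the response at several cutpoints, fit an unpenalized logistic regression of each dichotomized response on the standardized predictors, and take the leading eigenvectors of the stacked coefficient matrix as the estimated basis of the central subspace. The data-generating model, cutpoints, and plain gradient-ascent solver below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-index model: y depends on X only through the direction b = e1,
# so the central subspace is span(e1) and its dimension d = 1.
n, p = 500, 5
b = np.zeros(p)
b[0] = 1.0
X = rng.standard_normal((n, p))
y = X @ b + 0.2 * rng.standard_normal(n)

# Standardize predictors, as is customary in SDR methods.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

def logistic_fit(X, z, iters=300, lr=0.1):
    """Unpenalized logistic regression via gradient ascent on the
    log-likelihood (a simple stand-in for a proper solver)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p_hat = 1.0 / (1.0 + np.exp(-(X @ w)))
        w += lr * X.T @ (z - p_hat) / len(z)
    return w

# Dichotomize y at several quantile cutpoints and fit one logistic
# regression per cutpoint; each coefficient vector lies (approximately)
# in the central subspace under standard SDR conditions.
cuts = np.quantile(y, [0.25, 0.50, 0.75])
B = np.column_stack([logistic_fit(Xs, (y > c).astype(float)) for c in cuts])

# The leading eigenvectors of B B^T span the estimated central subspace.
_, vecs = np.linalg.eigh(B @ B.T)
b_hat = vecs[:, -1]                    # top eigenvector (d = 1 here)

# Absolute cosine similarity with the true direction (sign is arbitrary).
align = abs(b_hat @ b) / np.linalg.norm(b)
print(align > 0.9)
```

The penalized version of PLR would add a sparsity-inducing penalty (e.g. of SCAD or adaptive-lasso type, which is what typically yields the oracle property) to each logistic fit, so that uninformative predictors receive exactly zero coefficients; that step is omitted in this sketch.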