On the degrees of freedom in shape-restricted regression
M Meyer, M Woodroofe - The Annals of Statistics, 2000 - projecteuclid.org
For the problem of estimating a regression function, $\mu$ say, subject to shape constraints, like monotonicity or convexity, it is argued that the divergence of the maximum likelihood estimator provides a useful measure of the effective dimension of the model. Inequalities are derived for the expected mean squared error of the maximum likelihood estimator and the expected residual sum of squares. These generalize equalities from the case of linear regression. As an application, it is shown that the maximum likelihood estimator of the error variance is asymptotically normal with mean $\sigma^2$ and variance …. For monotone regression, it is shown that the maximum likelihood estimator of $\mu$ attains the optimal rate of convergence, and a bias correction to the maximum likelihood estimator of $\sigma^2$ is derived.
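For monotone regression, the constrained maximum likelihood estimator is the isotonic (least-squares monotone) fit, computable with the standard pool-adjacent-violators algorithm (PAVA); its divergence equals the number of distinct fitted values, which is the "effective dimension" the abstract refers to. A minimal sketch, with illustrative function names (`pava`, `effective_dimension`) not taken from the paper:

```python
def pava(y):
    """Least-squares monotone (non-decreasing) fit to the sequence y via
    the pool-adjacent-violators algorithm."""
    blocks = []  # each block is [sum, count]; its mean is a fitted level
    for v in y:
        blocks.append([float(v), 1])
        # Merge adjacent blocks whenever their means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit

def effective_dimension(fit):
    """Number of distinct fitted values (level sets): the divergence of
    the isotonic fit, used as the effective dimension of the model."""
    return len(set(fit))

if __name__ == "__main__":
    y = [1.0, 3.0, 2.0, 4.0]
    fit = pava(y)
    print(fit)                       # [1.0, 2.5, 2.5, 4.0]
    print(effective_dimension(fit))  # 3
```

Here the violating pair (3, 2) is pooled into a single level 2.5, so four observations yield three level sets; in the paper's terms the fit spends three effective degrees of freedom rather than four.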