An Eigenvector Perturbation Bound and Its Application

J Fan, W Wang, Y Zhong - Journal of Machine Learning Research, 2018 - jmlr.org
In statistics and machine learning, we are interested in the eigenvectors (or singular vectors)
of certain matrices (e.g., covariance matrices, data matrices, etc.). However, those matrices are …

An overview of large-dimensional covariance and precision matrix estimators with applications in chemometrics

J Engel, L Buydens, L Blanchet - Journal of Chemometrics, 2017 - Wiley Online Library
The covariance matrix (or its inverse, the precision matrix) is central to many chemometric
techniques. Traditional sample estimators perform poorly for high-dimensional data such as …

Large covariance estimation through elliptical factor models

J Fan, H Liu, W Wang - Annals of Statistics, 2018 - ncbi.nlm.nih.gov
We propose a general Principal Orthogonal complEment Thresholding (POET) framework
for large-scale covariance matrix estimation based on the approximate factor model. A set of …

Robust covariance estimation for approximate factor models

J Fan, W Wang, Y Zhong - Journal of econometrics, 2019 - Elsevier
In this paper, we study robust covariance estimation under the approximate factor model
with observed factors. We propose a novel framework to first estimate the initial joint …

Distributed estimation for principal component analysis: An enlarged eigenspace analysis

X Chen, JD Lee, H Li, Y Yang - Journal of the American Statistical …, 2022 - Taylor & Francis
The growing size of modern datasets brings many challenges to the existing statistical
estimation approaches, which calls for new distributed methodologies. This article studies …

A survey of high dimension low sample size asymptotics

M Aoshima, D Shen, H Shen, K Yata… - Australian & New …, 2018 - Wiley Online Library
Peter Hall's work illuminated many aspects of statistical thought, some of which are very well
known including the bootstrap and smoothing. However, he also explored many other lesser …

De-biased sparse PCA: Inference for eigenstructure of large covariance matrices

J Janková, S van de Geer - IEEE Transactions on Information …, 2021 - ieeexplore.ieee.org
Sparse principal component analysis has become one of the most widely used techniques
for dimensionality reduction in high-dimensional datasets. While many methods are …

Tensor Principal Component Analysis

A Babii, E Ghysels, J Pan - arXiv preprint arXiv:2212.12981, 2022 - arxiv.org
In this paper, we develop new methods for analyzing high-dimensional tensor datasets. A
tensor factor model describes a high-dimensional dataset as a sum of a low-rank component …

Decomposition-based correlation learning for multi-modal MRI-based classification of neuropsychiatric disorders

L Liu, J Chang, Y Wang, G Liang, YP Wang… - Frontiers in …, 2022 - frontiersin.org
Multi-modal magnetic resonance imaging (MRI) is widely used for diagnosing brain disease
in clinical practice. However, the high-dimensionality of MRI images is challenging when …

On Gaussian comparison inequality and its application to spectral analysis of large random matrices

F Han, S Xu, WX Zhou - 2018 - projecteuclid.org
Recently, Chernozhukov, Chetverikov, and Kato (Ann. Statist. 42 (2014) 1564–1597)
developed a new Gaussian comparison inequality for approximating the suprema of …