Jianhao Ma
Verified email at umich.edu - Homepage
Title
Cited by
Year
Global convergence of sub-gradient method for robust matrix recovery: Small initialization, noisy measurements, and over-parameterization
J Ma, S Fattahi
Journal of Machine Learning Research, 2022
Cited by 31 · 2022
Sign-RIP: A robust restricted isometry property for low-rank matrix recovery
J Ma, S Fattahi
2021 NeurIPS Workshop on Optimization for Machine Learning, 2021
Cited by 18* · 2021
Blessing of Depth in Linear Regression: Deeper Models Have Flatter Landscape Around the True Solution
J Ma, S Fattahi
Advances in Neural Information Processing Systems, 2022
Cited by 12* · 2022
Behind the Scenes of Gradient Descent: A Trajectory Analysis via Basis Function Decomposition
J Ma, L Guo, S Fattahi
ICLR 2023, 2022
Cited by 8 · 2022
Can Learning Be Explained By Local Optimality In Low-rank Matrix Recovery?
J Ma, S Fattahi
arXiv preprint arXiv:2302.10963, 2023
Cited by 4* · 2023
Towards Understanding Generalization via Decomposing Excess Risk Dynamics
J Teng, J Ma, Y Yuan
ICLR 2022, 2021
Cited by 4 · 2021
Convergence of Gradient Descent with Small Initialization for Unregularized Matrix Completion
J Ma, S Fattahi
arXiv preprint arXiv:2402.06756, 2024
Cited by 1 · 2024
Robust Sparse Mean Estimation via Incremental Learning
J Ma, RR Chen, Y He, S Fattahi, W Hu
arXiv preprint arXiv:2305.15276, 2023
2023
Articles 1–8