Optimization for deep learning: An overview

RY Sun - Journal of the Operations Research Society of China, 2020 - Springer
Optimization is a critical component in deep learning. We think optimization for neural
networks is an interesting topic for theoretical research for several reasons. First, its …

The global landscape of neural networks: An overview

R Sun, D Li, S Liang, T Ding… - IEEE Signal Processing …, 2020 - ieeexplore.ieee.org
One of the major concerns for neural network training is that the nonconvexity of the
associated loss functions may cause a bad landscape. The recent success of neural …
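
The truncated snippet only gestures at what a "bad landscape" means; a standard two-parameter toy example (mine, not drawn from the article) makes it concrete:

```latex
\[
  L(w_1, w_2) = \tfrac{1}{2}(w_1 w_2 - 1)^2,
  \qquad
  \nabla^2 L(0, 0) = \begin{pmatrix} 0 & -1 \\ -1 & 0 \end{pmatrix}.
\]
```

The origin is a critical point whose Hessian has eigenvalues ±1, i.e. a strict saddle, and the set of global minima {w₁w₂ = 1} is a hyperbola with two connected components. Even this two-parameter loss is nonconvex with a nontrivial landscape; deep networks compound such effects.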

Optimization for deep learning: theory and algorithms

R Sun - arXiv preprint arXiv:1912.08957, 2019 - arxiv.org
When and why can a neural network be successfully trained? This article provides an
overview of optimization algorithms and theory for training neural networks. First, we discuss …

Deep learning-based channel estimation for doubly selective fading channels

Y Yang, F Gao, X Ma, S Zhang - IEEE Access, 2019 - ieeexplore.ieee.org
In this paper, an online deep learning (DL)-based channel estimation algorithm for doubly
selective fading channels is proposed by employing a deep neural network (DNN). With …
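
The snippet names the ingredients (a DNN used for channel estimation); here is a minimal sketch of the supervised setup, with hypothetical sizes and synthetic data standing in for a channel simulator. This is not the paper's architecture, only an illustration of the learning problem:

```python
# Minimal sketch (hypothetical sizes, not the paper's architecture):
# an MLP mapping received pilot symbols to channel estimates, with
# complex values split into real/imaginary parts.
import torch
import torch.nn as nn

N_PILOTS, N_TAPS = 64, 16          # hypothetical dimensions

class ChannelEstimator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * N_PILOTS, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 2 * N_TAPS),   # real/imag channel taps
        )

    def forward(self, y):               # y: (batch, 2 * N_PILOTS)
        return self.net(y)

model = ChannelEstimator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One supervised step; random tensors stand in for (received pilots,
# true channel) pairs that a channel simulator would provide.
y = torch.randn(32, 2 * N_PILOTS)
h = torch.randn(32, 2 * N_TAPS)
loss = loss_fn(model(y), h)
opt.zero_grad(); loss.backward(); opt.step()
```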

What Happens after SGD Reaches Zero Loss? A Mathematical Framework

Z Li, T Wang, S Arora - arXiv preprint arXiv:2110.06914, 2021 - arxiv.org
Understanding the implicit bias of Stochastic Gradient Descent (SGD) is one of the key
challenges in deep learning, especially for overparametrized models, where the local …
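
A toy illustration of the phenomenon the paper formalizes (my construction, not the paper's framework): for L(u, v) = ½(uv − 1)², the zero-loss set is the hyperbola uv = 1, along which the Hessian trace u² + v² is smallest at |u| = |v| = 1. SGD with label noise, started at a sharp zero-loss point, drifts along the manifold toward the flat one:

```python
# Toy demo of implicit bias after zero loss: label-noise SGD on
# L(u, v) = 0.5 * (u * v - 1)^2 drifts along the zero-loss manifold
# toward the flattest minimum.
import numpy as np

rng = np.random.default_rng(0)
u, v = 4.0, 0.25            # on the manifold, but sharp: tr(Hessian) = 16.06
lr, sigma = 1e-2, 0.5

for _ in range(200_000):
    eps = sigma * rng.standard_normal()       # label noise on the target
    r = u * v - 1.0 - eps                     # noisy residual
    u, v = u - lr * r * v, v - lr * r * u     # simultaneous SGD step

print(f"u = {u:.2f}, v = {v:.2f}")  # both coordinates drift toward 1.0
```

Noiseless gradient descent would stay put once the loss hits zero; the noise-induced drift along the manifold is exactly the kind of effect the paper's framework is built to analyze.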

Deep transfer learning-based downlink channel prediction for FDD massive MIMO systems

Y Yang, F Gao, Z Zhong, B Ai… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Artificial intelligence (AI) based downlink channel state information (CSI) prediction for
frequency division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems …
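
A compact sketch of the transfer-learning recipe the title suggests (hypothetical sizes and synthetic data; not the paper's exact scheme): pretrain a predictor on plentiful source-domain CSI, then fine-tune only part of the network on scarce target-domain samples.

```python
# Two-stage transfer learning sketch: pretrain everything on source
# data, then freeze the feature extractor and fine-tune the head.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),   # shared feature extractor
    nn.Linear(256, 128),              # prediction head
)

def fit(model, params, x, y, steps=100, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        loss = nn.functional.mse_loss(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()

# Stage 1: pretrain on (synthetic) source-domain data.
x_src, y_src = torch.randn(1024, 128), torch.randn(1024, 128)
fit(model, model.parameters(), x_src, y_src)

# Stage 2: freeze the extractor; fine-tune the head on a small
# target-domain set.
for p in model[0].parameters():
    p.requires_grad_(False)
x_tgt, y_tgt = torch.randn(64, 128), torch.randn(64, 128)
fit(model, model[2].parameters(), x_tgt, y_tgt)
```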

Going beyond linear mode connectivity: The layerwise linear feature connectivity

Z Zhou, Y Yang, X Yang, J Yan… - Advances in Neural …, 2023 - proceedings.neurips.cc
Recent work has revealed many intriguing empirical phenomena in neural network training,
despite the poorly understood and highly complex loss landscapes and training dynamics …
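
Paraphrasing the title (the precise statement is the paper's; the notation here is mine): two networks θ_A, θ_B exhibit layerwise linear feature connectivity when, for every layer ℓ and input x,

```latex
\[
  f^{(\ell)}\!\bigl(\alpha\theta_A + (1-\alpha)\theta_B;\, x\bigr)
  \;\approx\;
  \alpha\, f^{(\ell)}(\theta_A; x) + (1-\alpha)\, f^{(\ell)}(\theta_B; x)
  \qquad \text{for all } \alpha \in [0, 1],
\]
```

where f^{(ℓ)}(θ; x) denotes the layer-ℓ feature map. This is stronger than linear mode connectivity, which constrains only the loss along the interpolation path, not the internal features.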

Recent theoretical advances in non-convex optimization

M Danilova, P Dvurechensky, A Gasnikov… - … and Probability: With a …, 2022 - Springer
Motivated by the recent surge of interest in non-convex optimization algorithms for
training deep neural networks and other optimization problems …
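
As a sample of the kind of guarantee such surveys cover (a textbook result, not specific to this book): for an L-smooth function f bounded below by f*, gradient descent x_{k+1} = x_k − (1/L)∇f(x_k) satisfies the descent inequality, which telescopes into a rate:

```latex
\[
  f(x_{k+1}) \le f(x_k) - \tfrac{1}{2L}\|\nabla f(x_k)\|^2
  \quad\Longrightarrow\quad
  \min_{0 \le k < K} \|\nabla f(x_k)\|^2 \le \frac{2L\bigl(f(x_0) - f^\star\bigr)}{K}.
\]
```

So an ε-stationary point is found within O(1/ε²) iterations; in the non-convex setting one settles for approximate stationarity rather than global optimality.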

On connected sublevel sets in deep learning

Q Nguyen - International conference on machine learning, 2019 - proceedings.mlr.press
This paper shows that every sublevel set of the loss function of a class of deep over-
parameterized neural nets with piecewise linear activation functions is connected and …
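
In symbols (notation mine), the claim concerns the sublevel sets

```latex
\[
  \Omega_c \;=\; \{\theta : L(\theta) \le c\}.
\]
```

Connectedness of every Ω_c means any two parameter vectors with loss at most c can be joined by a continuous path along which the loss never exceeds c, ruling out isolated low-loss valleys in the landscape.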

Explaining landscape connectivity of low-cost solutions for multilayer nets

R Kuditipudi, X Wang, H Lee, Y Zhang… - Advances in neural …, 2019 - proceedings.neurips.cc
Mode connectivity is a surprising phenomenon in the loss landscape of deep nets. Optima,
at least those discovered by gradient-based optimization, turn out to be connected by …
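
A quick probe one can run to see why this is surprising (my sketch, assuming a PyTorch toy setup): scan the loss along the straight line between two independently trained networks. The straight line usually crosses a barrier; the paper explains why low-cost connecting paths, not necessarily straight ones, exist for suitably stable networks.

```python
# Linear mode-connectivity probe: train two small MLPs from different
# seeds and scan the loss along the straight line between their weights.
import copy
import torch
import torch.nn as nn

def make_net(seed):
    torch.manual_seed(seed)
    return nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

x = torch.randn(512, 2)
y = (x[:, :1] * x[:, 1:] > 0).float()        # toy XOR-like labels
loss_fn = nn.BCEWithLogitsLoss()

def train(net, steps=500):
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for _ in range(steps):
        loss = loss_fn(net(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
    return net

net_a, net_b = train(make_net(0)), train(make_net(1))

# Evaluate the loss at theta(alpha) = (1 - alpha) * theta_A + alpha * theta_B.
probe = copy.deepcopy(net_a)
sd_a, sd_b = net_a.state_dict(), net_b.state_dict()
for alpha in [i / 10 for i in range(11)]:
    probe.load_state_dict(
        {k: (1 - alpha) * sd_a[k] + alpha * sd_b[k] for k in sd_a}
    )
    with torch.no_grad():
        print(f"alpha={alpha:.1f}  loss={loss_fn(probe(x), y).item():.3f}")
```

A bump in the printed losses at intermediate alpha is the barrier that naive interpolation hits; the paper's contribution is showing when curved or piecewise-linear paths avoid it.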