OTOv2: Automatic, Generic, User-Friendly

T Chen, L Liang, T Ding, Z Zhu, I Zharkov - arXiv preprint arXiv …, 2023 - arxiv.org
The existing model compression methods via structured pruning typically require
complicated multi-stage procedures. Each individual stage necessitates numerous …

HESSO: Towards Automatic Efficient and User Friendly Any Neural Network Training and Pruning

T Chen, X Qu, D Aponte, C Banbury, J Ko… - arXiv preprint arXiv …, 2024 - arxiv.org
Structured pruning is one of the most popular approaches to effectively compress the heavy
deep neural networks (DNNs) into compact sub-networks while retaining performance. The …
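
The snippet above defines structured pruning only in passing. As a rough orientation (and emphatically not the OTOv2/HESSO algorithm, which automates this end to end), a minimal sketch of one structured-pruning step might rank a convolution's output channels by group L2 norm and rebuild a smaller layer; the `keep_ratio` value and the one-group-per-output-channel grouping are illustrative assumptions:

```python
# Minimal structured-pruning sketch (generic; NOT the OTOv2/HESSO method).
# Rank the output channels of a Conv2d by group L2 norm, keep the strongest,
# and rebuild a structurally smaller Conv2d. keep_ratio is illustrative.
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    norms = conv.weight.detach().flatten(1).norm(dim=1)      # one norm per output channel
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep = torch.topk(norms, n_keep).indices.sort().values   # channel indices to retain

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned

conv = nn.Conv2d(16, 32, 3, padding=1)
print(prune_conv_channels(conv, keep_ratio=0.25))  # Conv2d(16, 8, ...)
```

In a real network, layers consuming this output must drop the matching input channels as well; tracking such dependencies across arbitrary architectures (e.g., via OTO's zero-invariant groups) is precisely the multi-stage complexity these papers aim to automate away.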

OTOv3: Automatic Architecture-Agnostic Neural Network Training and Compression from Structured Pruning to Erasing Operators

T Chen, T Ding, Z Zhu, Z Chen, HT Wu… - arXiv preprint arXiv …, 2023 - arxiv.org
Compressing a predefined deep neural network (DNN) into a compact sub-network with
competitive performance is crucial in the efficient machine learning realm. This topic spans …

S3Editor: A Sparse Semantic-Disentangled Self-Training Framework for Face Video Editing

G Wang, T Chen, K Ghasedi, HT Wu, T Ding… - arXiv preprint arXiv …, 2024 - arxiv.org
Face attribute editing plays a pivotal role in various applications. However, existing methods
encounter challenges in achieving high-quality results while preserving identity, editing …

Learning k-Level Sparse Neural Networks Using a New Generalized Weighted Group Sparse Envelope Regularization

Y Refael, I Arbel, W Huleihel - arXiv preprint arXiv:2212.12921, 2022 - arxiv.org
We propose an efficient method to learn both unstructured and structured sparse neural
networks during training, utilizing a novel generalization of the sparse envelope function …
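
The envelope-regularization entries here (and the TMLR version below) learn structured sparsity by adding a group penalty to the training loss. As a generic point of reference only, this is the classical group lasso, not the paper's generalized weighted group sparse envelope function; treating each output row of a weight matrix as a group and the value of `lam` are illustrative assumptions:

```python
# Classical group-lasso penalty (a generic baseline that group sparse
# envelope methods generalize -- NOT the paper's envelope function).
import torch
import torch.nn as nn
import torch.nn.functional as F

def group_lasso_penalty(model: nn.Module, lam: float = 1e-3) -> torch.Tensor:
    penalty = torch.zeros(())
    for p in model.parameters():
        if p.dim() >= 2:                       # each output row = one group
            penalty = penalty + p.flatten(1).norm(dim=1).sum()
    return lam * penalty

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
x, y = torch.randn(8, 64), torch.randint(0, 10, (8,))
loss = F.cross_entropy(model(x), y) + group_lasso_penalty(model)
loss.backward()  # gradients push whole rows toward zero -> structured sparsity
```

The sum of per-group L2 norms is nondifferentiable at zero, which is what drives entire groups exactly to zero; the k-level formulation in these papers, as the titles suggest, additionally targets a prescribed number of surviving groups.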

Structured Sparse Optimization

Y Dai - 2024 - search.proquest.com
In the age of high-dimensional data-driven science, sparse optimization techniques play a
vital role. Sparse optimization aims to discover solutions with compact representations in low …
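
For orientation, a standard formulation behind this dissertation's title (not specific to the thesis itself) seeks a solution with few nonzeros:

$$\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad \|x\|_0 \le k,$$

where $\|x\|_0$ counts the nonzero entries of $x$. Since this constraint is combinatorial, practical methods substitute convex surrogates such as the $\ell_1$ norm (lasso) or, for the structured case in the surrounding entries, sums of group norms.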

Learning k-Level Structured Sparse Neural Networks Using Group Envelope Regularization

Y Refael, I Arbel, W Huleihel - Transactions on Machine Learning Research - openreview.net
The extensive need for computational resources poses a significant obstacle to deploying
large-scale Deep Neural Networks (DNN) on devices with constrained resources. At the …
