A comprehensive survey of dataset distillation
Deep learning has advanced at an unprecedented pace over the last decade and has become the primary choice in many application domains. This progress is mainly attributed …
Data Optimization in Deep Learning: A Survey
O Wu, R Yao - arXiv preprint arXiv:2310.16499, 2023 - arxiv.org
Large-scale, high-quality data are considered an essential factor for the successful
application of many deep learning techniques. Meanwhile, numerous real-world deep …
Dataset Distillation in Latent Space
Dataset distillation (DD) is an emerging research area that aims to alleviate the heavy computational load of training models on large datasets. It tries to distill a large dataset into a …
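For concreteness, the sketch below shows one common flavor of dataset distillation, gradient matching, in plain PyTorch: the synthetic images are optimized so that they induce the same parameter gradients as real data. The tiny network, the data shapes, and names such as syn_x are illustrative assumptions, not the method of any single paper listed here, and the outer loop over re-sampled network initializations used in practice is omitted.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Toy classifier on which real and synthetic gradients are compared.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

    # Learnable synthetic set: 10 images per class, initialized from noise.
    num_classes, ipc = 10, 10
    syn_x = torch.randn(num_classes * ipc, 1, 28, 28, requires_grad=True)
    syn_y = torch.arange(num_classes).repeat_interleave(ipc)
    opt = torch.optim.SGD([syn_x], lr=0.1)

    def grads(x, y, create_graph=False):
        loss = F.cross_entropy(model(x), y)
        return torch.autograd.grad(loss, model.parameters(), create_graph=create_graph)

    for step in range(100):
        real_x = torch.randn(64, 1, 28, 28)            # stand-in for a real batch
        real_y = torch.randint(0, num_classes, (64,))  # stand-in labels
        g_real = grads(real_x, real_y)                  # targets, no graph needed
        g_syn = grads(syn_x, syn_y, create_graph=True)  # differentiable w.r.t. syn_x
        # Update the synthetic images so their induced gradients match the real ones.
        loss = sum(F.mse_loss(a, b) for a, b in zip(g_syn, g_real))
        opt.zero_grad()
        loss.backward()
        opt.step()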
Embracing Unknown Step by Step: Towards Reliable Sparse Training in Real World
Sparse training has emerged as a promising method for resource-efficient deep neural
networks (DNNs) in real-world applications. However, the reliability of sparse models …
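As a rough illustration of the sparse-training setting, here is a minimal static-mask baseline in PyTorch: the smallest weights are pruned by magnitude and the binary masks are re-applied after every update. The layer sizes, the 90% sparsity level, and the magnitude criterion are assumptions for the sketch; dynamic sparse-training methods additionally regrow pruned connections during training.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Hypothetical dense MLP; sparsity is enforced with fixed binary masks.
    model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
    sparsity = 0.9  # drop the smallest 90% of weights in each matrix

    masks = {}
    with torch.no_grad():
        for name, p in model.named_parameters():
            if p.dim() > 1:  # prune weight matrices, keep biases dense
                k = int(p.numel() * sparsity)
                threshold = p.abs().flatten().kthvalue(k).values
                masks[name] = (p.abs() > threshold).float()
                p.mul_(masks[name])  # apply the mask once up front

    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for step in range(100):
        x = torch.randn(64, 784)                # stand-in batch
        y = torch.randint(0, 10, (64,))
        loss = F.cross_entropy(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():                   # keep pruned weights at exactly zero
            for name, p in model.named_parameters():
                if name in masks:
                    p.mul_(masks[name])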
Dataset Distillation from First Principles: Integrating Core Information Extraction and Purposeful Learning
Dataset distillation (DD) is an increasingly important technique that focuses on constructing
a synthetic dataset capable of capturing the core information in training data to achieve …
Data-Efficient Generation for Dataset Distillation
While deep learning techniques have proven successful in image-related tasks, the exponentially increasing data storage and computation costs have become a significant …
Generative Dataset Distillation Based on Diffusion Model
This paper presents our method for the generative track of The First Dataset Distillation Challenge at ECCV 2024. Since diffusion models have become the mainstay of generative …
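To make the generative route concrete, below is a toy DDPM-style ancestral sampling loop in PyTorch on 2-D points. The untrained noise-prediction net, the linear beta schedule, and all sizes are assumptions for the sketch; an actual challenge entry would sample from a trained, typically class-conditional, image diffusion model and use those samples as the distilled set.

    import torch
    import torch.nn as nn

    T = 1000
    betas = torch.linspace(1e-4, 0.02, T)      # linear noise schedule
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)

    # Toy noise predictor eps_theta(x_t, t); a real model would be a trained U-Net.
    eps_model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 2))

    @torch.no_grad()
    def sample(n):
        x = torch.randn(n, 2)                  # start from pure Gaussian noise
        for t in reversed(range(T)):
            t_in = torch.full((n, 1), t / T)
            eps = eps_model(torch.cat([x, t_in], dim=1))
            # DDPM posterior mean: (x_t - beta_t / sqrt(1 - alpha_bar_t) * eps) / sqrt(alpha_t)
            mean = (x - betas[t] / (1.0 - alpha_bar[t]).sqrt() * eps) / alphas[t].sqrt()
            noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
            x = mean + betas[t].sqrt() * noise
        return x

    synthetic = sample(100)  # candidate synthetic points for a distilled set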
Revisit the Essence of Distilling Knowledge through Calibration
Knowledge Distillation (KD) has evolved into a practical technique for transferring knowledge from a well-performing model (the teacher) to a weaker model (the student). A counter …
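For reference, the snippet below is the standard temperature-scaled KD objective (Hinton-style soft targets plus a hard-label term), not the calibration analysis of the paper above; the temperature T=4 and the mixing weight alpha are illustrative choices.

    import torch
    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Soft-target term: KL between temperature-softened distributions.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)  # rescale so gradient magnitudes match the hard-label term
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard

    # Usage with random stand-in logits:
    s = torch.randn(8, 10, requires_grad=True)
    t = torch.randn(8, 10)
    y = torch.randint(0, 10, (8,))
    kd_loss(s, t, y).backward()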
Multiclass Alignment of Confidences and Softened Target Occurrences for Train-time Calibration
V Kugathasan, H Zhou, Z Izzo, G Kuruppu, S Yoon… - openreview.net
In spite of delivering remarkable predictive accuracy across many domains, including
computer vision and medical imaging, Deep Neural Networks (DNNs) are susceptible to …
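As a generic illustration of train-time calibration, not the multiclass alignment objective of the paper above, the sketch below adds a simple auxiliary penalty on the gap between mean confidence and batch accuracy alongside the usual cross-entropy loss; the 0.5 weight and the tensor shapes are assumptions.

    import torch
    import torch.nn.functional as F

    def calib_aux_loss(logits, labels):
        # Penalize the gap between mean confidence and batch accuracy,
        # a crude differentiable proxy for miscalibration.
        probs = F.softmax(logits, dim=1)
        conf = probs.max(dim=1).values.mean()
        # argmax is non-differentiable, so accuracy acts as a constant target.
        acc = (logits.argmax(dim=1) == labels).float().mean()
        return (conf - acc).abs()

    logits = torch.randn(32, 10, requires_grad=True)
    labels = torch.randint(0, 10, (32,))
    loss = F.cross_entropy(logits, labels) + 0.5 * calib_aux_loss(logits, labels)
    loss.backward()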
Calibration Bottleneck: What Makes Neural Networks less Calibratable?
DB Wang, ML Zhang - openreview.net
While modern deep neural networks have achieved remarkable success, they have
exhibited a notable deficiency in reliably estimating uncertainty. Many existing studies …
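For reference, miscalibration of the kind these papers study is usually quantified with the expected calibration error (ECE): predictions are binned by confidence and per-bin accuracy is compared to per-bin average confidence. A standard computation follows; the 15 bins and the random stand-in logits are illustrative.

    import torch
    import torch.nn.functional as F

    def expected_calibration_error(logits, labels, n_bins=15):
        probs = F.softmax(logits, dim=1)
        conf, pred = probs.max(dim=1)
        correct = pred.eq(labels).float()
        bins = torch.linspace(0, 1, n_bins + 1)
        ece = torch.zeros(1)
        for lo, hi in zip(bins[:-1], bins[1:]):
            in_bin = (conf > lo) & (conf <= hi)
            if in_bin.any():
                # Weight each bin's |accuracy - confidence| gap by its frequency.
                ece += in_bin.float().mean() * (correct[in_bin].mean() - conf[in_bin].mean()).abs()
        return ece.item()

    logits = torch.randn(1000, 10)
    labels = torch.randint(0, 10, (1000,))
    print(expected_calibration_error(logits, labels))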