Towards out-of-distribution generalization: A survey

J Liu, Z Shen, Y He, X Zhang, R Xu, H Yu… - arXiv preprint arXiv …, 2021 - arxiv.org
Traditional machine learning paradigms are based on the assumption that both training and
test data follow the same statistical pattern, which is mathematically referred to as …

Diffusion model as representation learner

X Yang, X Wang - … of the IEEE/CVF International Conference …, 2023 - openaccess.thecvf.com
Diffusion Probabilistic Models (DPMs) have recently demonstrated impressive
results on various generative tasks. Despite their promise, the learned representations of pre …

Priority-centric human motion generation in discrete latent space

H Kong, K Gong, D Lian, MB Mi… - Proceedings of the …, 2023 - openaccess.thecvf.com
Text-to-motion generation is a formidable task, aiming to produce human motions that align
with the input text while also adhering to human capabilities and physical laws. While there …

GDA: Generalized Diffusion for Robust Test-time Adaptation

YY Tsai, FC Chen, AYC Chen, J Yang… - Proceedings of the …, 2024 - openaccess.thecvf.com
Machine learning models face generalization challenges when exposed to out-of-
distribution (OOD) samples with unforeseen distribution shifts. Recent research reveals that …

Generator born from classifier

R Yu, X Wang - Advances in Neural Information Processing …, 2024 - proceedings.neurips.cc
In this paper, we make a bold attempt toward an ambitious task: given a pre-trained
classifier, we aim to reconstruct an image generator, without relying on any data samples …

Unveil conditional diffusion models with classifier-free guidance: A sharp statistical theory

H Fu, Z Yang, M Wang, M Chen - arXiv preprint arXiv:2403.11968, 2024 - arxiv.org
Conditional diffusion models serve as the foundation of modern image synthesis and find
extensive application in fields like computational biology and reinforcement learning. In …

Dispel: Domain generalization via domain-specific liberating

CY Chang, YN Chuang, G Wang, M Du… - arXiv preprint arXiv …, 2023 - arxiv.org
Domain generalization aims to learn a model that performs well on
unseen test domains by training only on limited source domains. However, existing domain …

Tackling Structural Hallucination in Image Translation with Local Diffusion

S Kim, C Jin, T Diethe, M Figini, HFJ Tregidgo… - arXiv preprint arXiv …, 2024 - arxiv.org
Recent developments in diffusion models have advanced conditioned image generation, yet
they struggle with reconstructing out-of-distribution (OOD) images, such as unseen tumors in …

Investigation of out-of-distribution detection across various models and training methodologies

BC Kim, B Kim, Y Hyun - Neural Networks, 2024 - Elsevier
Machine learning-based algorithms demonstrate impressive performance across
numerous fields; however, they continue to suffer from certain limitations. Even sophisticated …

Diagnosing and Rectifying Fake OOD Invariance: A Restructured Causal Approach

Z Chen, Y Zheng, ZR Lai, Q Guan, L Lin - Proceedings of the AAAI …, 2024 - ojs.aaai.org
Invariant representation learning (IRL) encourages predicting labels from invariant causal
features deconfounded from the environments, advancing the technical roadmap of …