The probability flow ODE is provably fast

S Chen, S Chewi, H Lee, Y Li, J Lu… - Advances in Neural …, 2024 - proceedings.neurips.cc
We provide the first polynomial-time convergence guarantees for the probability flow ODE
implementation (together with a corrector step) of score-based generative modeling. Our …
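For context, the probability flow ODE named in the title is the deterministic counterpart of the reverse-time diffusion SDE. In the usual score-based notation (a generic form, not necessarily this paper's exact setup), a forward process dX_t = f(X_t, t) dt + g(t) dW_t with marginal densities p_t yields

\[
  \frac{\mathrm{d}x_t}{\mathrm{d}t} = f(x_t, t) - \tfrac{1}{2}\, g(t)^2\, \nabla_x \log p_t(x_t),
\]

where implementations replace the score \nabla_x \log p_t with a learned estimate s_\theta(x_t, t); the corrector step mentioned in the snippet interleaves Langevin-type updates with the ODE integration.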

Deep learning for multivariate time series imputation: A survey

J Wang, W Du, W Cao, K Zhang, W Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
The ubiquitous missing values cause multivariate time series data to be only partially
observed, destroying the integrity of the time series and hindering effective time series data …

TSI-Bench: Benchmarking time series imputation

W Du, J Wang, L Qian, Y Yang, Z Ibrahim, F Liu… - arXiv preprint arXiv …, 2024 - arxiv.org
Effective imputation is a crucial preprocessing step for time series analysis. Despite the
development of numerous deep learning algorithms for time series imputation, the …

Deep momentum multi-marginal Schrödinger bridge

T Chen, GH Liu, M Tao… - Advances in Neural …, 2024 - proceedings.neurips.cc
It is a crucial challenge to reconstruct population dynamics using unlabeled samples from
distributions at coarse time intervals. Recent approaches such as flow-based models or …
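For reference, the multi-marginal Schrödinger bridge problem underlying this line of work can be written, in a generic form that may differ from the paper's momentum (phase-space) formulation, as a KL projection of a reference path measure onto the observed marginals:

\[
  \min_{P} \; \mathrm{KL}(P \,\|\, W) \quad \text{s.t.} \quad P_{t_i} = \mu_i, \quad i = 0, 1, \dots, N,
\]

where W is a reference path measure (e.g. Brownian motion, or an underdamped Langevin process in the momentum setting) and \mu_i are the population snapshots observed at the coarse time points t_i.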

A Survey of AIOps for Failure Management in the Era of Large Language Models

L Zhang, T Jia, M Jia, Y Wu, A Liu, Y Yang, Z Wu… - arXiv preprint arXiv …, 2024 - arxiv.org
As software systems grow increasingly intricate, Artificial Intelligence for IT Operations
(AIOps) methods have been widely used in software system failure management to ensure …

Reflected Schr\" odinger Bridge for Constrained Generative Modeling

W Deng, Y Chen, NT Yang, H Du, Q Feng… - arXiv preprint arXiv …, 2024 - arxiv.org
Diffusion models have become the go-to method for large-scale generative models in real-
world applications. These applications often involve data distributions confined within …
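As background, constrained generative modeling of the kind the title suggests is typically built on reflected diffusions, i.e. a Skorokhod-type SDE confined to a domain Ω. A generic form (not necessarily the paper's exact construction) is

\[
  \mathrm{d}X_t = f(X_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}B_t + \nu(X_t)\,\mathrm{d}L_t, \qquad X_t \in \overline{\Omega},
\]

where L_t is the boundary local time, which increases only when X_t touches \partial\Omega, and \nu is the inward normal, so the reflection term keeps samples inside the constraint set.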

UniTS: Building a unified time series model

S Gao, T Koker, O Queen, T Hartvigsen… - arXiv preprint arXiv …, 2024 - arxiv.org
Foundation models, especially LLMs, are profoundly transforming deep learning. Instead of
training many task-specific models, we can adapt a single pretrained model to many tasks …

UniTS: A unified multi-task time series model

S Gao, T Koker, O Queen, T Hartvigsen… - The Thirty-eighth …, 2024 - openreview.net
Although pre-trained transformers and reprogrammed text-based LLMs have shown strong
performance on time series tasks, the best-performing architectures vary widely across …

DiffImp: Efficient Diffusion Model for Probabilistic Time Series Imputation with Bidirectional Mamba Backbone

H Gao, W Shen, X Qiu, R Xu, J Hu, B Yang - arXiv preprint arXiv …, 2024 - arxiv.org
Probabilistic time series imputation has been widely applied in real-world scenarios due to
its ability to estimate the uncertainty of imputation results. Meanwhile, denoising diffusion …

MTSCI: A Conditional Diffusion Model for Multivariate Time Series Consistent Imputation

J Zhou, J Li, G Zheng, X Wang, C Zhou - Proceedings of the 33rd ACM …, 2024 - dl.acm.org
Missing values are prevalent in multivariate time series, compromising the integrity of
analyses and degrading the performance of downstream tasks. Consequently, research has …
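
Both DiffImp and MTSCI build on conditional denoising diffusion for time series imputation. The sketch below is a minimal, generic illustration of that recipe on a toy correlated Gaussian, where the exact denoiser is available in closed form; it uses a RePaint/CSDI-style conditioning heuristic (re-imposing noised copies of the observed values during reverse sampling) and is not the architecture of either paper. The helper names (exact_eps, impute) are invented for this sketch.

import numpy as np

rng = np.random.default_rng(0)

# Toy data distribution: a correlated 2-D Gaussian, so the exact epsilon-predictor
# is available in closed form and no network training is needed.
Sigma = np.array([[1.0, 0.9], [0.9, 1.0]])

# Variance-preserving (DDPM-style) noise schedule.
T = 200
betas = np.linspace(1e-4, 0.05, T)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)

def exact_eps(x, t):
    # Exact epsilon-prediction for x_t ~ N(0, abar*Sigma + (1 - abar)*I).
    # Stands in for the learned denoiser eps_theta(x_t, t) that a real
    # diffusion imputer would train on masked time series windows.
    abar = alpha_bar[t]
    C_t = abar * Sigma + (1.0 - abar) * np.eye(2)
    return np.sqrt(1.0 - abar) * x @ np.linalg.inv(C_t)

def impute(x_obs, n_samples=2000):
    # Sample the missing coordinate (index 1) given the observed one (index 0)
    # by reverse diffusion, overwriting the observed coordinate with a
    # forward-noised copy of the observation at every step.
    x = rng.standard_normal((n_samples, 2))  # start from the N(0, I) prior
    for t in range(T - 1, -1, -1):
        eps = exact_eps(x, t)
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bar[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:
            x = mean + np.sqrt(betas[t]) * rng.standard_normal((n_samples, 2))
            x[:, 0] = (np.sqrt(alpha_bar[t - 1]) * x_obs
                       + np.sqrt(1.0 - alpha_bar[t - 1]) * rng.standard_normal(n_samples))
        else:
            x = mean
            x[:, 0] = x_obs  # clamp to the exact observation at the final step
    return x[:, 1]

samples = impute(x_obs=1.0)
# Analytic conditional for this covariance: x1 | x0 = 1.0 is N(0.9, 1 - 0.81),
# so the imputed samples should land near mean 0.9 and std ~0.436.
print("imputed mean %.3f, std %.3f" % (samples.mean(), samples.std()))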