Unsupervised representation learning for time series: A review

Q Meng, H Qian, Y Liu, Y Xu, Z Shen, L Cui - arXiv preprint arXiv …, 2023 - arxiv.org
Unsupervised representation learning approaches aim to learn discriminative feature
representations from unlabeled data, without the requirement of annotating every sample …

SimMMDG: A simple and effective framework for multi-modal domain generalization

H Dong, I Nejjar, H Sun, E Chatzi… - Advances in Neural …, 2023 - proceedings.neurips.cc
In real-world scenarios, achieving domain generalization (DG) presents significant
challenges as models are required to generalize to unknown target distributions …

WOODS: Benchmarks for out-of-distribution generalization in time series

JC Gagnon-Audet, K Ahuja, MJ Darvishi-Bayazi… - arXiv preprint arXiv …, 2022 - arxiv.org
Machine learning models often fail to generalize well under distributional shifts.
Understanding and overcoming these failures have led to a research field of Out-of …

Domain-specific risk minimization for domain generalization

YF Zhang, J Wang, J Liang, Z Zhang, B Yu… - Proceedings of the 29th …, 2023 - dl.acm.org
Domain generalization (DG) approaches typically use the hypothesis learned on source
domains for inference on the unseen target domain. However, such a hypothesis can be …

Deep generative domain adaptation with temporal relation attention mechanism for cross-user activity recognition

X Ye, KIK Wang - Pattern Recognition, 2024 - Elsevier
In sensor-based Human Activity Recognition (HAR), a predominant assumption is
that the data utilized for training and evaluation purposes are drawn from the same …

UniTS: Building a unified time series model

S Gao, T Koker, O Queen, T Hartvigsen… - arXiv preprint arXiv …, 2024 - arxiv.org
Foundation models, especially LLMs, are profoundly transforming deep learning. Instead of
training many task-specific models, we can adapt a single pretrained model to many tasks …

UniTS: A unified multi-task time series model

S Gao, T Koker, O Queen, T Hartvigsen… - The Thirty-eighth …, 2024 - openreview.net
Although pre-trained transformers and reprogrammed text-based LLMs have shown strong
performance on time series tasks, the best-performing architectures vary widely across …

Generalizable low-resource activity recognition with diverse and discriminative representation learning

X Qin, J Wang, S Ma, W Lu, Y Zhu, X Xie… - Proceedings of the 29th …, 2023 - dl.acm.org
Human activity recognition (HAR) is a time series classification task that focuses on
identifying the motion patterns from human sensor readings. Adequate data is essential but …

Self-Supervised Learning of Time Series Representation via Diffusion Process and Imputation-Interpolation-Forecasting Mask

Z Senane, L Cao, VL Buchner, Y Tashiro… - Proceedings of the 30th …, 2024 - dl.acm.org
Time Series Representation Learning (TSRL) focuses on generating informative
representations for various Time Series (TS) modeling tasks. Traditional Self-Supervised …

Continuous Invariance Learning

Y Lin, F Zhou, L Tan, L Ma, J Liu, Y He, Y Yuan… - arXiv preprint arXiv …, 2023 - arxiv.org
Invariance learning methods aim to learn invariant features in the hope that they generalize
under distributional shifts. Although many tasks are naturally characterized by continuous …