Unsupervised representation learning for time series: A review
Unsupervised representation learning approaches aim to learn discriminative feature
representations from unlabeled data, without the requirement of annotating every sample …
SimMMDG: A simple and effective framework for multi-modal domain generalization
In real-world scenarios, achieving domain generalization (DG) presents significant
challenges as models are required to generalize to unknown target distributions …
WOODS: Benchmarks for out-of-distribution generalization in time series
Machine learning models often fail to generalize well under distributional shifts.
Understanding and overcoming these failures have led to a research field of Out-of …
Domain-specific risk minimization for domain generalization
Domain generalization (DG) approaches typically use the hypothesis learned on source
domains for inference on the unseen target domain. However, such a hypothesis can be …
Deep generative domain adaptation with temporal relation attention mechanism for cross-user activity recognition
Abstract In sensor-based Human Activity Recognition (HAR), a predominant assumption is
that the data utilized for training and evaluation purposes are drawn from the same …
UniTS: Building a unified time series model
Foundation models, especially LLMs, are profoundly transforming deep learning. Instead of
training many task-specific models, we can adapt a single pretrained model to many tasks …
UniTS: A unified multi-task time series model
Although pre-trained transformers and reprogrammed text-based LLMs have shown strong
performance on time series tasks, the best-performing architectures vary widely across …
Generalizable low-resource activity recognition with diverse and discriminative representation learning
Human activity recognition (HAR) is a time series classification task that focuses on
identifying the motion patterns from human sensor readings. Adequate data is essential but …
Self-Supervised Learning of Time Series Representation via Diffusion Process and Imputation-Interpolation-Forecasting Mask
Time Series Representation Learning (TSRL) focuses on generating informative
representations for various Time Series (TS) modeling tasks. Traditional Self-Supervised …
Continuous Invariance Learning
Invariance learning methods aim to learn invariant features in the hope that they generalize
under distributional shifts. Although many tasks are naturally characterized by continuous …