Self-supervised learning for time series analysis: Taxonomy, progress, and prospects
Self-supervised learning (SSL) has recently achieved impressive performance on various
time series tasks. The most prominent advantage of SSL is that it reduces the dependence …
Deep learning for time series classification and extrinsic regression: A current survey
Time Series Classification and Extrinsic Regression are important and challenging machine
learning tasks. Deep learning has revolutionized natural language processing and computer …
Self-supervised contrastive pre-training for time series via time-frequency consistency
Pre-training on time series poses a unique challenge due to the potential mismatch between
pre-training and target domains, such as shifts in temporal dynamics, fast-evolving trends …
TS2Vec: Towards universal representation of time series
This paper presents TS2Vec, a universal framework for learning representations of time
series at an arbitrary semantic level. Unlike existing methods, TS2Vec performs contrastive …
Unsupervised time-series representation learning with iterative bilinear temporal-spectral fusion
Unsupervised/self-supervised time series representation learning is a challenging problem
because of its complex dynamics and sparse annotations. Existing works mainly adopt the …
Learning latent seasonal-trend representations for time series forecasting
Forecasting complex time series is ubiquitous and vital in a range of applications but
challenging. Recent advances endeavor to achieve progress by incorporating various deep …
TEST: Text prototype aligned embedding to activate LLM's ability for time series
This work summarizes two ways to accomplish Time-Series (TS) tasks in today's Large
Language Model (LLM) context: LLM-for-TS (model-centric) designs and trains a …
TSMixer: Lightweight MLP-Mixer model for multivariate time series forecasting
Transformers have gained popularity in time series forecasting for their ability to capture
long-sequence interactions. However, their memory and compute-intensive requirements …
COCOA: Cross modality contrastive learning for sensor data
Self-Supervised Learning (SSL) is a new paradigm for learning discriminative
representations without labeled data, and has reached comparable or even state-of-the-art …
Assessing the state of self-supervised human activity recognition using wearables
The emergence of self-supervised learning in the field of wearables-based human activity
recognition (HAR) has opened up opportunities to tackle the most pressing challenges in the …