This is a repository to help readers interested in learning universal representations of time series with deep learning. If your paper is missing or you have other requests, please open an issue, create a pull request, or contact patara.t@kaist.ac.kr. We will update this repository on a regular basis, in line with the top-tier conference publication cycles, to keep it up to date.
Next Batch: AAAI 2025, WWW 2025, ICLR 2025, IJCAI 2025, ICDM 2025, ICDE 2025, CIKM 2025, KDD 2025, ICML 2025, NeurIPS 2025
Accompanying Paper: Universal Time-Series Representation Learning: A Survey
@article{trirat2024universal,
  title={Universal Time-Series Representation Learning: A Survey},
  author={Patara Trirat and Yooju Shin and Junhyeok Kang and Youngeun Nam and Jihye Na and Minyoung Bae and Joeun Kim and Byunghyun Kim and Jae-Gil Lee},
  journal={arXiv preprint arXiv:2401.03717},
  year={2024}
}
This group presents methods that focus on enhancing the usefulness of the training data at hand. These approaches prioritize engineering the data itself, rather than designing model architectures or loss functions, to capture the underlying patterns, trends, and relevant features within the time series. As shown in the figure, we categorize these data-centric approaches into two groups based on their objectives: improving data quality and increasing data quantity.
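For intuition on the quantity-oriented side, below is a minimal sketch of two widely used time-series augmentations (jittering and magnitude scaling). The function names, noise scales, and toy input are illustrative assumptions, not taken from any particular surveyed method:

```python
import numpy as np

def jitter(x: np.ndarray, sigma: float = 0.03) -> np.ndarray:
    """Add Gaussian noise at every time step (a simple quantity-oriented augmentation)."""
    return x + np.random.normal(loc=0.0, scale=sigma, size=x.shape)

def scaling(x: np.ndarray, sigma: float = 0.1) -> np.ndarray:
    """Rescale each variable by a random factor drawn around 1.0."""
    factors = np.random.normal(loc=1.0, scale=sigma, size=(1, x.shape[1]))
    return x * factors

# x: a toy multivariate series of shape (timesteps, variables)
x = np.sin(np.linspace(0, 10, 100))[:, None].repeat(3, axis=1)
augmented = scaling(jitter(x))  # a new synthetic sample for training
```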
As neural architectures play a crucial role in the quality of learned representations, this group examines novel network architecture designs aimed at enhancing representation learning. These improvements (depicted in the figure) include, for example, better temporal modeling, handling of missing values and irregular sampling, and extraction of inter-variable dependencies in multivariate time series.
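As one illustration of an architectural ingredient for long-range temporal modeling, here is a minimal sketch of a dilated causal convolution encoder in PyTorch. The class name, depth, and layer widths are illustrative assumptions, not a specific surveyed model:

```python
import torch
import torch.nn as nn

class DilatedCausalEncoder(nn.Module):
    """Stacked dilated 1D convolutions: the receptive field grows
    exponentially with depth while each output stays causal."""
    def __init__(self, in_channels: int, hidden: int = 64, depth: int = 4):
        super().__init__()
        layers = []
        for i in range(depth):
            dilation = 2 ** i
            layers += [
                # left-pad by (kernel_size - 1) * dilation so the convolution is causal
                nn.ConstantPad1d((2 * dilation, 0), 0.0),
                nn.Conv1d(in_channels if i == 0 else hidden, hidden,
                          kernel_size=3, dilation=dilation),
                nn.ReLU(),
            ]
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, variables, timesteps) -> (batch, hidden, timesteps)
        return self.net(x)

z = DilatedCausalEncoder(in_channels=3)(torch.randn(8, 3, 128))
```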
Studies in this category center on devising novel learning objective functions for the representation learning process, i.e., model (pre-)training. As shown in the figure, these studies can be classified into three groups based on their learning objectives: task-adaptive, non-contrasting, and contrasting losses.
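For intuition on the contrasting side, below is a minimal InfoNCE-style loss sketch, assuming a PyTorch setting where `z1[i]` and `z2[i]` are encoder outputs for two augmented views of the same series; the function name and temperature value are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Matching rows of z1 and z2 are positives; all other pairs
    in the batch act as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # (batch, batch) similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives on the diagonal
    return F.cross_entropy(logits, labels)

loss = info_nce(torch.randn(16, 64), torch.randn(16, 64))
```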
- https://github.com/qingsongedu/awesome-AI-for-time-series-papers
- https://github.com/qianlima-lab/time-series-ptms
- https://github.com/qingsongedu/time-series-transformers-review
- https://github.com/lixus7/Time-Series-Works-Conferences
- https://github.com/qingsongedu/Awesome-TimeSeries-AIOps-LM-LLM
- https://github.com/qingsongedu/Awesome-SSL4TS