【AREA-introduction】Time Series
Time series tasks with RNNs, LSTMs, and Transformers. (Under construction.)
Contents
Introduction
- Time series tasks
Traditional methods
- Prophet
- ARIMA
- ETS
- Shapelets
Deep learning methods
- RNN and LSTM
- Transformers
  - Vanilla Transformer
  - Informer
  - Autoformer
  - Crossformer
  - PatchTST
  - FEDformer
  - TSAT
  - iTransformer
Spatial and Temporal
Future
References
Introduction
Time series tasks
A reliable and efficient representation of multivariate time series is crucial for various downstream machine learning tasks.
In multivariate time series forecasting, each variable depends on its own historical values, and there are inter-dependencies among variables as well.
Models therefore have to capture both intra-series (temporal) and inter-series (cross-variable) relationships.
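As a concrete illustration, here is a minimal NumPy sketch (function and variable names are my own) of how such data is usually framed as a supervised forecasting problem: a lookback window over all variates as input, a future horizon as target.

```python
import numpy as np

# Toy multivariate series: T time steps, N variates (e.g. N sensors).
T, N = 1000, 7
series = np.random.randn(T, N).cumsum(axis=0)  # random-walk-like toy data

def make_windows(x, lookback=96, horizon=24):
    """Slice a (T, N) array into supervised (input, target) pairs.

    inputs  -> (num_samples, lookback, N): past values of all variates
    targets -> (num_samples, horizon, N): future values to forecast
    """
    xs, ys = [], []
    for t in range(len(x) - lookback - horizon + 1):
        xs.append(x[t : t + lookback])
        ys.append(x[t + lookback : t + lookback + horizon])
    return np.stack(xs), np.stack(ys)

X, Y = make_windows(series)
print(X.shape, Y.shape)  # (881, 96, 7) (881, 24, 7)
```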
Traditional methods
Overview
Methods like ARIMA and ETS offer statistical interpretability, but they lack adaptability and are limited in capturing non-linear relationships; a minimal ARIMA sketch follows the list below.
Prophet (additive model of trend, seasonality, and holiday effects, from Meta)
ARIMA (autoregressive integrated moving average)
ETS (exponential smoothing)
Shapelets (discriminative subsequences, mainly used for time series classification)
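To make the classical approach concrete, here is a minimal sketch using statsmodels' ARIMA; the order (p, d, q) = (1, 1, 1) and the toy data are arbitrary illustrative choices, not a recommendation.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# ARIMA is a per-variable (univariate) model, so use one toy series.
y = np.random.randn(200).cumsum()

# order=(p, d, q): 1 autoregressive lag, 1 differencing step,
# 1 moving-average lag.
model = ARIMA(y, order=(1, 1, 1))
fitted = model.fit()

# Multi-step forecast; the fitted AR/MA coefficients stay inspectable,
# which is the statistical interpretability mentioned above.
print(fitted.forecast(steps=10))
print(fitted.params)
```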
Deep learning methods
RNN and LSTM:
RNNs (Recurrent Neural Networks) are designed to process sequential data by carrying a hidden state from one time step to the next.
LSTM (Long Short-Term Memory) networks are designed to capture both short- and long-term patterns in sequential inputs, and their gating mechanism alleviates the vanishing and exploding gradient problems of plain RNN models.
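As a sketch of the idea (not any specific paper's architecture; names and hyperparameters are illustrative), here is a small PyTorch LSTM that maps a lookback window of several variates to a multi-step forecast:

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Maps (batch, lookback, n_vars) -> (batch, horizon, n_vars)."""

    def __init__(self, n_vars=7, hidden=64, horizon=24):
        super().__init__()
        self.horizon, self.n_vars = horizon, n_vars
        # The LSTM's input/forget/output gates are what let it keep
        # long-term information and avoid the vanishing/exploding
        # gradients of a plain RNN.
        self.lstm = nn.LSTM(n_vars, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon * n_vars)

    def forward(self, x):
        out, _ = self.lstm(x)      # (batch, lookback, hidden)
        last = out[:, -1]          # hidden state after the whole window
        y = self.head(last)        # (batch, horizon * n_vars)
        return y.view(-1, self.horizon, self.n_vars)

model = LSTMForecaster()
x = torch.randn(32, 96, 7)         # a batch of lookback windows
print(model(x).shape)              # torch.Size([32, 24, 7])
```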
Transformers:
Vanilla Transformer
Informer (AAAI 2021)
Autoformer (NeurIPS 2021)
Crossformer (ICLR 2023)
PatchTST (ICLR 2023)
FEDformer (ICML 2022)
TSAT (2022)
This method combines a Transformer with a GNN.
iTransformer
iTransformer is a method that came out in 2023 and currently shows state-of-the-art performance.
Basically, the model makes a few slight changes to the vanilla Transformer:
1. It projects the original series into "variate tokens": instead of one token covering one time step of all variates, each token now covers all time steps of one variate.
2. Attention is then applied across these variate tokens to capture multivariate correlations, while the feed-forward network is applied to each token to learn per-series representations. A minimal sketch of the inverted embedding follows.
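Here is a minimal sketch of this inverted "variate token" embedding (not the official iTransformer code; dimensions and names are illustrative):

```python
import torch
import torch.nn as nn

class VariateTokenEmbedding(nn.Module):
    """One token per variate instead of one token per time step.

    Input  (batch, T, N): T time steps of N variates.
    Output (batch, N, d): each token embeds one variate's full history.
    """

    def __init__(self, lookback=96, d_model=128):
        super().__init__()
        self.proj = nn.Linear(lookback, d_model)

    def forward(self, x):
        x = x.transpose(1, 2)   # (batch, N, T): whole series per variate
        return self.proj(x)     # (batch, N, d_model)

embed = VariateTokenEmbedding()
attn = nn.MultiheadAttention(embed_dim=128, num_heads=8, batch_first=True)

x = torch.randn(32, 96, 7)      # 96 time steps, 7 variates
tokens = embed(x)               # (32, 7, 128): 7 variate tokens
# Attention now mixes the 7 variates rather than the 96 time steps,
# which is exactly the inversion described above.
out, _ = attn(tokens, tokens, tokens)
print(out.shape)                # torch.Size([32, 7, 128])
```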
Spatial and Temporal
- Traffic Transformer (GIS 2020)
Future
- Inductive Biases for Time Series Transformers
- Transformers and GNN for Time Series
- Pre-trained Transformers for Time Series
- Transformers with NAS for Time Series
References
- https://zhuanlan.zhihu.com/p/484044005
- https://blog.csdn.net/weixin_36488653/category_11689003.html