
Gateformer: Advancing Multivariate Time Series Forecasting through Temporal and Variate-Wise Attention with Gated Representations

We provide the PyTorch implementation of Gateformer, a Transformer-based model for multivariate time series forecasting that integrates temporal (cross-time) and variate-wise (cross-variate) attention with gated representations. The full paper is available on arXiv.

Architecture

The model encodes each variate's series independently through two distinct pathways to obtain variate-wise representations: (1) temporal dependency embeddings, which capture cross-time dependencies through patching and temporal-wise attention, and (2) global temporal embeddings, which encode global temporal patterns through an MLP. These complementary embeddings are integrated through a gating operation to form variate embeddings, which serve as the input for cross-variate dependency modeling. Variate-wise attention is then applied to the variate embeddings to model cross-variate dependencies, producing variate-interacting embeddings. Finally, a copy of the variate embeddings (without interaction) is combined with the variate-interacting embeddings through a second gating operation to regulate cross-variate correlations, and the resulting output is passed through a projection layer to generate predictions.
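
For intuition, the gating operation can be sketched as follows. This is a minimal illustration under assumed tensor shapes and module names, not the repository's exact code:

import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Element-wise gate that blends two embeddings of equal shape.

    Hypothetical sketch of the gating described above; the actual
    Gateformer implementation may compute the gate differently.
    """
    def __init__(self, d_model: int):
        super().__init__()
        # The gate is predicted from the concatenation of both inputs.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # a, b: (batch, n_variates, d_model), e.g. the temporal dependency
        # embeddings and the global temporal embeddings of each variate.
        g = torch.sigmoid(self.gate(torch.cat([a, b], dim=-1)))
        return g * a + (1.0 - g) * b  # element-wise convex blend

The same mechanism can blend the variate embeddings with the variate-interacting embeddings before the projection layer.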

Results of Multivariate Long-Term Forecasting

We evaluate Gateformer against competitive baseline models on widely recognized benchmark datasets, using MSE and MAE as evaluation metrics (lower values indicate better performance).
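
For reference, both metrics can be computed as follows. This is a straightforward sketch; the repository's evaluation code may average over dimensions differently:

import torch

def mse(pred: torch.Tensor, true: torch.Tensor) -> torch.Tensor:
    # Mean squared error over all forecast horizons and variates.
    return torch.mean((pred - true) ** 2)

def mae(pred: torch.Tensor, true: torch.Tensor) -> torch.Tensor:
    # Mean absolute error; less sensitive to large deviations than MSE.
    return torch.mean(torch.abs(pred - true))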

Visualization

Performance Boosting Using Our Framework

Our proposed framework seamlessly integrates with Transformer-based and LLM-based models, delivering up to a 20.7% performance improvement over the original models.

Visualization

∗ denotes models integrated with our framework.

Representation Learning

We evaluate the representations learned by our model, focusing on their generalization and transferability across datasets. A key strength of our method lies in its ability to effectively capture cross-variate correlations, enabling superior generalization and transferability to unseen datasets.
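
One common way to probe transferability is to freeze the pretrained encoder and fit only a new projection head on the target dataset. The sketch below is hypothetical: the helper itself and the attribute names encoder and projection are assumptions, not this repository's API:

import torch.nn as nn

def prepare_for_transfer(model: nn.Module, d_model: int, pred_len: int) -> nn.Module:
    # Hypothetical helper: freeze the representation-learning layers...
    for p in model.encoder.parameters():
        p.requires_grad = False
    # ...and attach a fresh projection head for the target forecast horizon.
    model.projection = nn.Linear(d_model, pred_len)
    return model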

Usage

1. Install Dependencies

Install Python 3.11, PyTorch, and the other required dependencies:

pip install -r requirements.txt

2. Download Dataset

Download the datasets and place them in the ./dataset/ folder. All benchmark datasets can be obtained from the public iTransformer GitHub repository.
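
A typical layout, assuming the dataset names used in the iTransformer benchmarks (check the scripts in ./scripts/ for the exact paths each experiment expects):

dataset/
├── electricity/
│   └── electricity.csv
├── traffic/
│   └── traffic.csv
├── weather/
│   └── weather.csv
└── Solar/
    └── solar_AL.txt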

3. Train and Evaluate the Model

We provide experiment scripts for Gateformer across all benchmark datasets in the ./scripts/ directory.

To run an experiment, execute the corresponding script, for example:

bash ./scripts/multivariate_forecasting/Electricity/Gateformer.sh
bash ./scripts/multivariate_forecasting/Traffic/Gateformer.sh
bash ./scripts/multivariate_forecasting/Weather/Gateformer.sh
bash ./scripts/multivariate_forecasting/SolarEnergy/Gateformer.sh
