Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion

Kunze Wang, Caren Han, Josiah Poon

This is the official implementation for Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion (EMNLP Findings 2023).

One Line Running

Simply run: python main.py -d ICEWS14

Requirements: torch, dgl (CUDA version)

Arguments explanation

Argument        Default   Description
--gpu           0         Set CUDA device
--dataset, -d   ICEWS14   Dataset; choose from GDELT, ICEWS05-15, ICEWS14, ICEWS14s, ICEWS18, WIKI
--dropout       0.2       Dropout rate
--n-hidden      200       Hidden dimension per layer
--n-layers      2         Number of layers in one GNN
--history-len   3         History length
--lr            0.001     Learning rate
--early_stop    5         Early-stop patience (epochs)
--easy_copy     0         Suppress most printed results; keep only the final output
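As a rough guide to how the options in the table fit together, here is a minimal argparse sketch mirroring the documented names and defaults. This is a hypothetical illustration, not the actual parser in main.py, which may define its arguments differently.

```python
import argparse

def build_parser():
    # Hypothetical reconstruction of the CLI from the argument table above.
    parser = argparse.ArgumentParser(description="Re-Temp training (sketch)")
    parser.add_argument("--gpu", type=int, default=0, help="Set CUDA device")
    parser.add_argument("-d", "--dataset", default="ICEWS14",
                        choices=["GDELT", "ICEWS05-15", "ICEWS14",
                                 "ICEWS14s", "ICEWS18", "WIKI"],
                        help="Dataset name")
    parser.add_argument("--dropout", type=float, default=0.2, help="Dropout rate")
    parser.add_argument("--n-hidden", type=int, default=200,
                        help="Hidden dimension per layer")
    parser.add_argument("--n-layers", type=int, default=2,
                        help="Number of layers in one GNN")
    parser.add_argument("--history-len", type=int, default=3,
                        help="History length")
    parser.add_argument("--lr", type=float, default=0.001, help="Learning rate")
    parser.add_argument("--early_stop", type=int, default=5,
                        help="Early-stop patience (epochs)")
    parser.add_argument("--easy_copy", type=int, default=0,
                        help="Suppress most printed results")
    return parser

# Example: override the dataset and learning rate, keep other defaults.
args = build_parser().parse_args(["-d", "ICEWS18", "--lr", "0.0005"])
print(args.dataset, args.lr)
```

Note that argparse maps hyphenated flags such as --n-hidden to underscore attributes (args.n_hidden).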

Citation

@inproceedings{wang2023re,
  title={Re-Temp: Relation-Aware Temporal Representation Learning for Temporal Knowledge Graph Completion},
  author={Wang, Kunze and Han, Caren and Poon, Josiah},
  booktitle={Findings of the Association for Computational Linguistics: EMNLP 2023},
  pages={258--269},
  year={2023}
}

Acknowledgement

Parts of the code are inspired by previous work in this area. We express our gratitude for all prior contributions.

If you cannot replicate the results reported in the paper using the default hyperparameters, please consider updating your torch and dgl package versions.
