This repository was archived by the owner on Mar 3, 2024. It is now read-only.

CyberZHG/torch-multi-head-attention


PyTorch Multi-Head Attention


Install

pip install torch-multi-head-attention

Usage

from torch_multi_head_attention import MultiHeadAttention

# Build a layer with input feature size 768 split across 12 heads
# (each head attends over 768 / 12 = 64 features).
attention = MultiHeadAttention(in_features=768, head_num=12)
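For context, here is a minimal self-attention sketch. It assumes the layer's forward pass takes separate query, key, and value tensors of shape (batch, seq_len, in_features) and returns a tensor of the same shape; the input sizes below are illustrative, not part of the original README.

import torch
from torch_multi_head_attention import MultiHeadAttention

layer = MultiHeadAttention(in_features=768, head_num=12)

# Illustrative input: a batch of 2 sequences of length 10,
# with feature size matching in_features.
x = torch.rand(2, 10, 768)

# Self-attention: query, key, and value are all the same tensor.
# Assumed signature: forward(q, k, v, mask=None).
out = layer(x, x, x)
print(out.shape)  # expected: torch.Size([2, 10, 768])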
