feat: Add support for flash attention converter by gs-olive · Pull Request #2560 · pytorch/TensorRT · GitHub

feat: Add support for flash attention converter #2560

Merged

gs-olive merged 1 commit into pytorch:main from gs-olive:scaled_dot_product_attention_converter on Jan 9, 2024
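
The converter added by this PR lowers `torch.nn.functional.scaled_dot_product_attention` when a model is compiled through Torch-TensorRT. Below is a minimal sketch of how such a model might be compiled to exercise the converter; the module name, input shapes, and precision settings are illustrative assumptions, not taken from this PR:

```python
# Minimal sketch: a module whose forward calls F.scaled_dot_product_attention,
# compiled with Torch-TensorRT. Shapes, dtype, and the module name are
# illustrative assumptions, not taken from this PR.
import torch
import torch.nn.functional as F
import torch_tensorrt


class AttnBlock(torch.nn.Module):  # hypothetical example module
    def forward(self, query, key, value):
        # This is the op the new converter is expected to handle.
        return F.scaled_dot_product_attention(query, key, value)


model = AttnBlock().eval().cuda()
# (batch, heads, seq_len, head_dim) -- assumed layout for the SDPA inputs
q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.half)
k = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.half)
v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.half)

trt_model = torch_tensorrt.compile(
    model,
    ir="dynamo",                      # Dynamo frontend, where converters are registered
    inputs=[q, k, v],
    enabled_precisions={torch.half},
)
out = trt_model(q, k, v)
```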
Commits

Commits on Dec 27, 2023
