- **transformers** (Public, forked from huggingface/transformers): 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. Python · Apache License 2.0 · Updated Jun 16, 2025
- **fast-hadamard-transform** (Public, forked from Dao-AILab/fast-hadamard-transform): Fast Hadamard transform in CUDA, with a PyTorch interface
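For context on what the fork accelerates: the fast Walsh–Hadamard transform replaces an O(n²) matrix multiply with O(n log n) butterfly passes. A plain-Python sketch of the algorithm (illustrative only; the repo's CUDA kernel and PyTorch interface are not shown here):

```python
def fwht(x):
    """In-place fast Walsh-Hadamard transform (unnormalized).

    len(x) must be a power of 2. Each pass combines pairs of
    elements at stride h with a (sum, difference) butterfly.
    """
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x
```

Applying it to a constant vector concentrates all mass in the first coefficient, the expected behavior of an unnormalized Hadamard transform.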
- **flash-attention-w-tree-attn** (Public, forked from Dao-AILab/flash-attention): Fast and memory-efficient exact attention
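The "memory-efficient exact" part of that description rests on computing softmax in a single streaming pass, so attention scores never need to be materialized all at once. A simplified sketch of that online-softmax trick (illustrative; not the fork's kernel):

```python
import math

def online_softmax(scores):
    """Numerically stable softmax in one streaming pass.

    Maintains a running max m and a running denominator d,
    rescaling d whenever a new max is seen. This is the core
    identity that lets attention kernels process scores in tiles.
    """
    m, d = float("-inf"), 0.0
    for s in scores:
        m_new = max(m, s)
        d = d * math.exp(m - m_new) + math.exp(s - m_new)
        m = m_new
    return [math.exp(s - m) / d for s in scores]
```

The result matches an ordinary two-pass softmax, but each score is visited only once before normalization.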
- **nano-dpo** (Public): A minimal implementation of Direct Preference Optimization (DPO) in Chinese
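For readers unfamiliar with the objective this repo implements: DPO trains a policy directly on preference pairs by comparing its log-probabilities against a frozen reference model. A minimal plain-Python sketch of the per-pair loss (argument names are illustrative; a real implementation works on batched tensors):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of a full response
    under the trainable policy or the frozen reference model.
    beta controls how strongly the policy may deviate from the reference.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log sigmoid(x) == log(1 + exp(-x)), i.e. softplus(-x)
    return math.log1p(math.exp(-logits))
```

When the policy equals the reference the loss is log 2; it falls below that as the policy favors the chosen response more than the reference does.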
- **nano-patch-sequence-pack** (Public): Just a few lines to combine 🤗 Transformers, Flash Attention 2, and torch.compile — simple, clean, fast ⚡
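Sequence packing, which the repo name presumably refers to, concatenates variable-length samples into one row and records cumulative boundaries instead of padding, the `cu_seqlens` convention consumed by Flash Attention's variable-length kernels. A minimal sketch of the packing step (illustrative only, not this repo's code):

```python
def pack_sequences(seqs):
    """Concatenate variable-length token sequences without padding.

    Returns the packed token list plus cumulative sequence lengths
    (cu_seqlens), so downstream attention can keep samples separate.
    """
    packed, cu_seqlens = [], [0]
    for s in seqs:
        packed.extend(s)
        cu_seqlens.append(cu_seqlens[-1] + len(s))
    return packed, cu_seqlens
```

Slicing `packed[cu_seqlens[i]:cu_seqlens[i + 1]]` recovers the i-th original sequence, which is how per-sample boundaries survive packing.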