Repositories list
446 repositories
data-prep-kit (Public)
LLMs-from-scratch (Public)
RWKV-LM (Public)
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable), combining the best of RNN and transformer: great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding.
lm-evaluation-harness (Public)
EasyLM (Public)
ChatRWKV (Public)
llama2.py (Public)
llama-mistral (Public)
zipline-reloaded (Public)
book-dp1 (Public)
smart-chatbot-ui (Public)
spark-nlp-workshop (Public)
DeepTime (Public)
FasterTransformer (Public)
google-research (Public)
nbdev (Public)
nuxt (Public)
tensorflow2 (Public)
t5-experiments (Public)
RWKV-LM-LoRA (Public)
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable), combining the best of RNN and transformer: great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding.
d3 (Public)
PaddleFleetX (Public)
dtc-workshop (Public)
H3 (Public)
deepmind-research (Public)
riak (Public)
pytorch-Deep-Learning (Public)
aerospike-server (Public)
pytorch (Public)