Stars
Train Large Language Models on MLX.
Towards Human-Friendly, Fast Learning and Adaptable Agent Communities
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Unified Training of Universal Time Series Forecasting Transformers
[EMNLP 2023] Context Compression for Auto-regressive Transformers with Sentinel Tokens
Unofficial implementation of RealFill
Automatic Speech Recognition with Speaker Diarization based on OpenAI Whisper
🦦 Otter, a multi-modal model based on OpenFlamingo (open-sourced version of DeepMind's Flamingo), trained on MIMIC-IT and showcasing improved instruction-following and in-context learning ability.
llama.cpp with the BakLLaVA model, describing what it sees
Run evaluation on LLMs using the HumanEval benchmark
Easy-to-use headless React Hooks to run LLMs in the browser with WebGPU. Just useLLM().
Shepherd: A foundational framework enabling federated instruction tuning for large language models
Inference code and configs for the ReplitLM model family
[ICLR 2024] Fine-tuning LLaMA to follow Instructions within 1 Hour and 1.2M Parameters
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.
Pluggable auth infrastructure for Web3 wallets and dapps
Tools that allow applications to interact with the Torus Network. For integration, look at docs.web3auth.io
Simple infrastructure that enables Web3 wallets and applications to provide seamless user logins for both mainstream and Web3.0 users.
Updates an AWS CloudFront distribution
White-label, design, and own the full UI/UX with self-hosted Web3Auth (tKey). All the power of threshold key management at your fingertips.
Rust implementation of {t,n}-threshold ECDSA (elliptic curve digital signature algorithm).
Torus nodes run a Distributed Key Generation protocol amongst themselves that allows for the generation, storage, and assignment of cryptographic keys.
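The last two entries both build on {t,n}-threshold key sharing. As a rough illustration of the underlying idea, here is a minimal, self-contained Rust sketch of Shamir secret sharing over a small prime field: a secret is split into n shares of which any t reconstruct it via Lagrange interpolation. This is not code from either repository; real threshold-ECDSA and DKG protocols work over elliptic-curve groups and never reconstruct the key in one place, and the field prime and fixed coefficient here are hypothetical choices for the example.

```rust
// Illustrative Shamir {t,n} secret sharing over GF(P).
// NOT production crypto: coefficients should be random, and threshold-ECDSA/DKG
// never assemble the full secret at a single party.
const P: u128 = 2_305_843_009_213_693_951; // Mersenne prime 2^61 - 1 (hypothetical choice)

fn add(a: u128, b: u128) -> u128 { (a + b) % P }
fn sub(a: u128, b: u128) -> u128 { (a + P - b) % P }
fn mul(a: u128, b: u128) -> u128 { (a * b) % P } // products fit in u128 since P < 2^61

// Modular inverse via Fermat's little theorem: a^(P-2) mod P.
fn inv(a: u128) -> u128 {
    let (mut base, mut exp, mut acc) = (a % P, P - 2, 1u128);
    while exp > 0 {
        if exp & 1 == 1 { acc = mul(acc, base); }
        base = mul(base, base);
        exp >>= 1;
    }
    acc
}

// Split `secret` into n shares (x, f(x)); any t of them reconstruct it.
fn split(secret: u128, t: usize, n: usize, coeffs: &[u128]) -> Vec<(u128, u128)> {
    assert_eq!(coeffs.len(), t - 1); // degree t-1 polynomial with f(0) = secret
    (1..=n as u128)
        .map(|x| {
            // Evaluate c_{t-1}*x^{t-1} + ... + c_1*x in Horner form, then add the secret.
            let y = coeffs.iter().rev().fold(0u128, |acc, &c| add(mul(acc, x), c));
            (x, add(mul(y, x), secret))
        })
        .collect()
}

// Reconstruct the secret from any t shares via Lagrange interpolation at x = 0.
fn reconstruct(shares: &[(u128, u128)]) -> u128 {
    let mut secret = 0u128;
    for (i, &(xi, yi)) in shares.iter().enumerate() {
        let (mut num, mut den) = (1u128, 1u128);
        for (j, &(xj, _)) in shares.iter().enumerate() {
            if i != j {
                num = mul(num, xj);          // product of x_j for j != i
                den = mul(den, sub(xj, xi)); // product of (x_j - x_i) for j != i
            }
        }
        secret = add(secret, mul(yi, mul(num, inv(den))));
    }
    secret
}

fn main() {
    let secret = 123_456_789;
    // 2-of-3 sharing; one fixed "random" coefficient keeps the example reproducible.
    let shares = split(secret, 2, 3, &[42]);
    assert_eq!(reconstruct(&shares[0..2]), secret); // any two shares suffice
    assert_eq!(reconstruct(&shares[1..3]), secret);
    println!("recovered: {}", reconstruct(&shares[..2]));
}
```

Any t shares pin down the unique degree-(t-1) polynomial, while t-1 shares reveal nothing about f(0); the DKG protocol mentioned above extends this so that no single node ever holds the whole polynomial.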