Stars
A Research UNIX V2 beta from 1972 brought back to life
Sources for the book "Machine Learning in Production"
Simple, unified interface to multiple Generative AI providers
The LLVM Project is a collection of modular and reusable compiler and toolchain technologies.
An annotated implementation of the Transformer paper.
Distribute and run LLMs with a single file.
yeison / modular
Forked from modular/modular. The Mojo Programming Language.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
tiktoken is a fast BPE tokeniser for use with OpenAI's models.
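As an aside, the core idea behind a BPE tokeniser like tiktoken is to repeatedly merge the most frequent adjacent pair of tokens into a new token. A toy sketch of one merge step (purely illustrative, not tiktoken's actual implementation):

```python
# Toy byte-pair-encoding (BPE) merge step -- illustrates the idea behind
# BPE tokenisers such as tiktoken; this is NOT tiktoken's real code.
from collections import Counter


def most_frequent_pair(tokens):
    """Return the most common adjacent token pair."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get)


def merge_pair(tokens, pair, new_token):
    """Replace every occurrence of `pair` with `new_token`."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(new_token)
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out


tokens = list("aaabdaaabac")
pair = most_frequent_pair(tokens)        # ('a', 'a') is most frequent here
tokens = merge_pair(tokens, pair, "aa")  # ['aa','a','b','d','aa','a','b','a','c']
```

A real tokeniser learns an ordered list of such merges from a corpus and applies them in order; tiktoken implements this efficiently in Rust with Python bindings.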
Implementation of Reinforcement Learning Algorithms. Python, OpenAI Gym, Tensorflow. Exercises and Solutions to accompany Sutton's Book and David Silver's course.
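The central update in several of the algorithms that repository covers is the tabular Q-learning rule from Sutton's book. A minimal didactic sketch (not code from the repository itself):

```python
# One tabular Q-learning step:
#   Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
# Didactic sketch only; the state/action names below are made up.
def q_update(Q, s, a, r, s_next, alpha=0.5, gamma=0.9):
    """Apply one Q-learning update in place and return the new Q(s, a)."""
    best_next = max(Q[s_next].values()) if Q[s_next] else 0.0
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])
    return Q[s][a]


# Hypothetical two-state example: taking "right" in s0 yields reward 1
# and lands in s1, whose best known action value is 1.0.
Q = {"s0": {"left": 0.0, "right": 0.0}, "s1": {"stay": 1.0}}
q_update(Q, "s0", "right", r=1.0, s_next="s1")  # -> 0.5 * (1 + 0.9*1.0) = 0.95
```

Because the target uses the max over next-state actions rather than the action actually taken, Q-learning is off-policy, in contrast to SARSA, which the same book also covers.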
Use any linux distribution inside your terminal. Enable both backward and forward compatibility with software and freedom to use whatever distribution you’re more comfortable with. Mirror available…
Awesome-LLM: a curated list of Large Language Model resources
Open source code for AlphaFold 2.
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
Port of OpenAI's Whisper model in C/C++
antimatter15 / alpaca.cpp
Forked from ggml-org/llama.cpp. Locally run an Instruction-Tuned Chat-Style LLM.
Running large language models on a single GPU for throughput-oriented scenarios.
PyGWalker: Turn your dataframe into an interactive UI for visual analysis
A playbook for systematically maximizing the performance of deep learning models.
Relax! Flux is the ML library that doesn't make you tensor