Stars
Package for defining computation graphs and performing intervention experiments
Relax! Flux is the ML library that doesn't make you tensor
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
A lightweight library for PyTorch training tools and utilities
The fundamental package for scientific computing with Python.
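As a quick illustration of the array computing NumPy provides (a minimal sketch, not taken from the repository itself), the core ideas are vectorized elementwise arithmetic and broadcasting:

```python
import numpy as np

# Vectorized arithmetic: operations apply elementwise, no explicit loops.
x = np.linspace(0.0, 1.0, 5)        # array([0.  , 0.25, 0.5 , 0.75, 1.  ])
y = x ** 2 + 1.0

# Broadcasting: a (3, 1) column combined with a (4,) row yields a (3, 4) grid.
grid = np.arange(3).reshape(3, 1) * np.arange(4)

print(y.sum())      # 6.875
print(grid.shape)   # (3, 4)
```

Broadcasting is what lets NumPy express grid-style computations without nested loops; most of the downstream libraries in this list (JAX, PyTorch, and so on) adopt the same semantics.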
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.
Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU.
Deep universal probabilistic programming with Python and PyTorch
A translation of Melanie Mitchell's original Copycat project from Lisp to Python.