Stars
Kanana: Compute-efficient Bilingual Language Models
friendshipkim / Compact-Language-Models-via-Pruning-and-Knowledge-Distillation
Forked from alperiox/Compact-Language-Models-via-Pruning-and-Knowledge-Distillation. Unofficial implementation of https://arxiv.org/pdf/2407.14679
A family of compressed models obtained via pruning and knowledge distillation
Official repository for "AM-RADIO: Reduce All Domains Into One"
A curated list for Efficient Large Language Models
Awesome-LLM: a curated list of Large Language Models
A list of AI coding tools (assistants, completions, refactoring, etc.)
A playbook for systematically maximizing the performance of deep learning models.
Adaptive Learning of Tensor Network Structures
Development repository for the Triton language and compiler
TensorLy-Torch: Deep Tensor Learning with TensorLy and PyTorch
Fast Block Sparse Matrices for PyTorch
An Aspiring Drop-In Replacement for NumPy at Scale
Investment Research for Everyone, Everywhere.
One second to read GitHub code with VS Code.
Rich is a Python library for rich text and beautiful formatting in the terminal.
A complete computer science study plan to become a software engineer.
rga: ripgrep, but also search in PDFs, E-Books, Office documents, zip, tar.gz, etc.
Meta package for the Regolith Desktop Environment
Collection of recent methods on (deep) neural network compression and acceleration.
State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
Flax is a neural network library for JAX that is designed for flexibility.
Stock options, RSUs, taxes — read the latest edition: www.holloway.com/ec
A cheatsheet of modern C++ language and library features.
A curated list of neural network pruning resources.
A collection of various deep learning architectures, models, and tips