Andrei-Aksionov / Starred · GitHub

Yet Another Language Model: LLM inference in C++/CUDA, no libraries except for I/O

C++ · 362 stars · 33 forks · Updated Jan 15, 2025

👁️ An original set of fundamental Python recipes for Computer Vision and Digital Image Processing.

Jupyter Notebook · 126 stars · 30 forks · Updated Apr 21, 2023

AI Image Signal Processing and Computational Photography. Official library for NTIRE (CVPR) and AIM (ICCV/ECCV) Challenges. You will find Learned ISPs, RAW Restoration-Upsampling-Reconstruction, Im…

Jupyter Notebook · 457 stars · 51 forks · Updated Mar 5, 2025

The easiest way to deploy agents, models, RAG, pipelines and more. No MLOps. No YAML.

Python · 3,168 stars · 208 forks · Updated May 30, 2025

LLM101n: Let's build a Storyteller

33,521 stars · 1,829 forks · Updated Aug 1, 2024

Thunder gives your PyTorch models superpowers for training and inference. Unlock out-of-the-box optimizations for performance, memory, and parallelism, or roll out your own.

Python · 1,355 stars · 96 forks · Updated May 30, 2025

The Museum of Modern Art (MoMA) collection data

1,425 stars · 261 forks · Updated May 27, 2025

20+ high-performance LLMs with recipes to pretrain, finetune, and deploy at scale.

Python · 12,184 stars · 1,237 forks · Updated May 28, 2025

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.

Python · 6,062 stars · 520 forks · Updated Sep 6, 2024