OctoTools: An agentic framework with extensible tools for complex reasoning
[Official Repo] Visual Mamba: A Survey and New Outlooks
🔥🔥🔥 Latest papers, code, and datasets on Vid-LLMs.
Collection of AWESOME vision-language models for vision tasks
[CVPR 2024] Alpha-CLIP: A CLIP Model Focusing on Wherever You Want
CLIP Itself is a Strong Fine-tuner: Achieving 85.7% and 88.0% Top-1 Accuracy with ViT-B and ViT-L on ImageNet
A multi-modal CLIP model trained on the medical dataset ROCO
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image (see the minimal sketch after this list)
Fine-tuning OpenAI's CLIP model on the Indian Fashion Dataset
Codebase for Merging Language Models (ICML 2024)
The Truth Is In There: Improving Reasoning in Language Models with Layer-Selective Rank Reduction
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
A script for training ConvNextV2 on the CIFAR-10 dataset using FSDP for distributed training.
A PyTorch implementation of MobileNetV2 on CIFAR-10
Best CIFAR-10, CIFAR-100 results with wide-residual networks using PyTorch
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
✨✨Latest Advances on Multimodal Large Language Models
This repository contains the code and datasets for our ICCV-W paper 'Enhancing CLIP with GPT-4: Harnessing Visual Descriptions as Prompts'
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support Llama-3/3.1, Llama-2, LLaMA, BLOOM, Vicuna, Baichuan, TinyLlama, etc.
Awesome Incremental Learning
Awesome Knowledge-Distillation. Knowledge-distillation papers (2014-2021), organized by category.
A curated list of neural network pruning resources.
Code for Piggyback: Adapting a Single Network to Multiple Tasks by Learning to Mask Weights
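
The CLIP entry above describes its core use: scoring candidate text snippets against an image. Below is a minimal sketch of that workflow, assuming the openai/CLIP package is installed and a local example image `photo.jpg` exists (the filename and candidate snippets are placeholders, not part of any listed repo):

```python
# Minimal sketch: rank candidate text snippets for an image with CLIP.
# Assumes `pip install git+https://github.com/openai/CLIP.git` and a local
# example image "photo.jpg" (hypothetical filename).
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

image = preprocess(Image.open("photo.jpg")).unsqueeze(0).to(device)
texts = clip.tokenize(["a dog", "a cat", "a city street"]).to(device)

with torch.no_grad():
    logits_per_image, _ = model(image, texts)   # image-text similarity scores
    probs = logits_per_image.softmax(dim=-1)    # probability over the snippets

print(probs.cpu().numpy())  # the highest value marks the most relevant snippet
```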