- Vanderbilt University
- United States
Stars
Industry-leading face manipulation platform
Inspecting and Editing Knowledge Representations in Language Models
PaCE: Parsimonious Concept Engineering for Large Language Models (NeurIPS 2024)
TensorZero creates a feedback loop for optimizing LLM applications — turning production data into smarter, faster, and cheaper models.
Visual Language Transformer Interpreter - An interactive visualization tool for interpreting vision-language transformers
Asynchronous FastAPI wrapper for AsyncOpenAI and the OpenAI Assistants API
Stanford NLP Python library for Representation Finetuning (ReFT)
Progressive Growing of GANs for Improved Quality, Stability, and Variation
Elucidating the Design Space of Diffusion-Based Generative Models (EDM)
Official repo for Asyrp: Diffusion Models Already Have a Semantic Latent Space (ICLR 2023)
Unofficial implementation of "Discovering Interpretable Directions in the Semantic Latent Space of Diffusion Models"
Improving neural network representations using human similarity judgments
GitHub Pages template based on HTML and Markdown for personal, portfolio-based websites.
This repo is meant to serve as a guide for Machine Learning/AI technical interviews.
Official implementation of project NoiseCLR, published at CVPR 2024
An open source implementation of CLIP.
Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in PyTorch
Stanford NLP Python library for understanding and improving PyTorch models via interventions
Evaluate interpretability methods on localizing and disentangling concepts in LLMs.
graphpatch is a library for activation patching on PyTorch neural network models.
Transformer Explained Visually: Learn How LLM Transformer Models Work with Interactive Visualization
Function Vectors in Large Language Models (ICLR 2024)
This repository contains the data and code of the paper titled "IllusionVQA: A Challenging Optical Illusion Dataset for Vision Language Models"
Code for paper "G-Eval: NLG Evaluation using GPT-4 with Better Human Alignment"
Spearman's rank correlation coefficient in NodeJS
Pre-trained models, code, and data to train and use models from "Pushing the Limits of Paraphrastic Sentence Embeddings with Millions of Machine Translations"
Official code for ICML 2024 paper on Persona In-Context Learning (PICLe)