Stars
An extensible benchmark for evaluating large language models on planning
Must-read papers on knowledge editing for large language models.
2-2000x faster ML algorithms with 50% less memory usage; works on all hardware, new and old.
IVON optimizer for neural networks based on variational learning.
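For flavor, here is a minimal sketch of the variational-learning idea behind such optimizers: maintain a Gaussian posterior over the weights, sample them for each forward pass, and drive the mean and variance with gradient and curvature estimates. Everything below (names, update rules, constants) is illustrative and is not the package's actual API.

```python
import torch

mu = torch.zeros(10)                  # posterior mean over 10 weights
h = torch.ones(10)                    # running curvature (precision) estimate
lr = 0.1

def loss_fn(w):
    return ((w - 1.0) ** 2).sum()     # toy quadratic loss, minimum at w = 1

for _ in range(200):
    sigma = 1.0 / torch.sqrt(h)       # posterior std derived from curvature
    w = (mu + sigma * torch.randn(10)).requires_grad_()  # sample weights
    g, = torch.autograd.grad(loss_fn(w), w)
    h = 0.9 * h + 0.1 * g * g         # EMA of squared gradients as curvature
    mu = mu - lr * g / (h + 1e-8)     # preconditioned update of the mean

print(mu.mean())                      # -> close to 1.0
```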
Code for "Counterfactual Token Generation in Large Language Models", Arxiv 2024.
A codebase that makes differentially private training of transformers easy.
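The core mechanism such a codebase automates is DP-SGD: clip each per-example gradient to a fixed norm, then add calibrated Gaussian noise before the optimizer step. A bare PyTorch sketch of that loop follows; it is not this repository's API, and the clip norm and noise multiplier are made-up values.

```python
import torch

model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
C, noise_mult = 1.0, 1.1              # clip norm and noise multiplier (made up)

def dp_step(xs, ys):
    grads = [torch.zeros_like(p) for p in model.parameters()]
    for x, y in zip(xs, ys):          # per-example gradients
        model.zero_grad()
        torch.nn.functional.mse_loss(model(x), y).backward()
        norm = torch.sqrt(sum((p.grad ** 2).sum() for p in model.parameters()))
        scale = torch.clamp(C / (norm + 1e-8), max=1.0)   # clip to norm C
        for g, p in zip(grads, model.parameters()):
            g += p.grad * scale
    for g, p in zip(grads, model.parameters()):
        g += torch.randn_like(g) * C * noise_mult  # calibrated Gaussian noise
        p.grad = g / len(xs)
    opt.step()

dp_step([torch.randn(4) for _ in range(8)], [torch.randn(1) for _ in range(8)])
```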
A Python package to assess and improve fairness of machine learning models.
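The package's entry point for assessment is `MetricFrame`, which disaggregates any sklearn-style metric by sensitive group; the toy data below is made up for illustration.

```python
from fairlearn.metrics import MetricFrame
from sklearn.metrics import accuracy_score

y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]
sex = ["F", "F", "F", "M", "M", "M"]

mf = MetricFrame(metrics=accuracy_score,
                 y_true=y_true, y_pred=y_pred,
                 sensitive_features=sex)
print(mf.overall)        # accuracy on everyone
print(mf.by_group)       # accuracy per sensitive group
print(mf.difference())   # largest gap between groups
```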
Any model. Any hardware. Zero compromise. Built with @ziglang / @openxla / MLIR / @bazelbuild
Python implementation of Peng Ding's "A First Course in Causal Inference".
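As a taste of what the early chapters implement, here is the Neyman difference-in-means estimator of the average treatment effect under complete randomization, on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
z = rng.binomial(1, 0.5, n)            # randomized binary treatment
y = 2.0 * z + rng.normal(size=n)       # outcome with true ATE = 2

ate_hat = y[z == 1].mean() - y[z == 0].mean()
se_hat = np.sqrt(y[z == 1].var(ddof=1) / (z == 1).sum()
                 + y[z == 0].var(ddof=1) / (z == 0).sum())  # Neyman variance
print(f"ATE ~ {ate_hat:.2f} +/- {1.96 * se_hat:.2f}")
```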
OpenXAI: Towards a Transparent Evaluation of Model Explanations
Language model alignment-focused deep learning curriculum
Pen and paper exercises in machine learning
A list of awesome papers and cool resources on optimal transport and its applications in general! As you will notice, this list is currently mostly focused on optimal transport for machine learning…
A simple and extensible library to create Bayesian neural network layers for PyTorch.
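Conceptually, such a layer holds a distribution over its weights and samples them on every forward pass via the reparameterization trick. A generic sketch of the idea, not the library's own classes:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    def __init__(self, n_in, n_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(n_out, n_in))        # weight means
        self.w_rho = nn.Parameter(torch.full((n_out, n_in), -3.0))  # pre-softplus stds
        self.b = nn.Parameter(torch.zeros(n_out))

    def forward(self, x):
        sigma = F.softplus(self.w_rho)                   # ensure std > 0
        w = self.w_mu + sigma * torch.randn_like(sigma)  # reparameterized sample
        return F.linear(x, w, self.b)

layer = BayesianLinear(4, 2)
print(layer(torch.randn(8, 4)))   # two stochastic forward passes differ,
print(layer(torch.randn(8, 4)))   # because the weights are resampled each call
```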
Code accompanying the paper "Stochastic gradient MCMC".
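The workhorse update in that literature is SGLD: a (stochastic) gradient step on the log posterior plus Gaussian noise whose variance matches the step size. A toy version targeting a standard normal:

```python
import torch

theta = torch.zeros(1, requires_grad=True)
step = 0.01
samples = []
for _ in range(5000):
    logp = -0.5 * (theta ** 2).sum()       # log density up to a constant
    g, = torch.autograd.grad(logp, theta)
    with torch.no_grad():
        # SGLD: half-step gradient ascent on log p plus sqrt(step) noise
        theta += 0.5 * step * g + (step ** 0.5) * torch.randn(1)
    samples.append(theta.item())
# samples now approximate draws from N(0, 1)
```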
A modular framework for neural networks with Euclidean symmetry
[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
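The key trick in the paper is computing self-attention inside non-overlapping local windows, shifted between layers (via torch.roll) so information flows across window boundaries. A simplified sketch of the window-partition step, not the repository's exact code:

```python
import torch

def window_partition(x, ws):
    """Split (B, H, W, C) feature maps into (num_windows * B, ws*ws, C) tokens."""
    B, H, W, C = x.shape
    x = x.view(B, H // ws, ws, W // ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)

x = torch.randn(2, 8, 8, 96)          # batch of 8x8 feature maps, 96 channels
windows = window_partition(x, ws=4)
print(windows.shape)                  # (8, 16, 96): 4 windows x 2 images,
                                      # 16 tokens each; attention runs per window
```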
A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.
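At its simplest, the voting strategy such a framework provides reduces to averaging softmax outputs across independently trained copies of a base network. A generic sketch, not the library's API:

```python
import torch
import torch.nn as nn

def make_base():
    return nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))

ensemble = [make_base() for _ in range(5)]   # each would be trained separately
x = torch.randn(8, 4)
with torch.no_grad():
    probs = torch.stack([m(x).softmax(dim=-1) for m in ensemble]).mean(dim=0)
pred = probs.argmax(dim=-1)                  # soft-voting prediction
```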
A gaggle of deep neural architectures for text ranking and question answering, designed for Pyserini.
T81-558: Keras - Applications of Deep Neural Networks at Washington University in St. Louis
Train deepGLM with MATLAB, R, and Python.
A tool for holistic analysis of language generation systems.
Code for reproducing the numerical results in the "Waste-free Sequential Monte Carlo" paper.
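For context, the resample-move skeleton that waste-free SMC refines looks like this in NumPy; it is a toy reweight/resample/move step, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
x = rng.normal(size=N)                       # particles from the proposal N(0, 1)

def log_target(t):
    return -0.5 * (t - 2.0) ** 2             # unnormalized log density of N(2, 1)

logw = log_target(x) + 0.5 * x ** 2          # incremental log weights (target/proposal)
w = np.exp(logw - logw.max())
w /= w.sum()
x = x[rng.choice(N, size=N, p=w)]            # multinomial resampling

prop = x + 0.5 * rng.normal(size=N)          # random-walk Metropolis move step
accept = np.log(rng.uniform(size=N)) < log_target(prop) - log_target(x)
x = np.where(accept, prop, x)
print(x.mean(), x.std())                     # roughly 2.0 and 1.0
```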