Stars
Quantus is an eXplainable AI toolkit for responsible evaluation of neural network explanations
Spectraformer: a unified random feature framework for approximating and learning the kernel function in the linearized attention of the Transformer
Source code of the ROAD benchmark for feature attribution methods (ICML 2022)
[ICLR 2022] Official implementation of cosformer-attention in cosFormer: Rethinking Softmax in Attention
Anonymous Github is a proxy server to support anonymous browsing of GitHub repositories for open-science code and data.
Python implementation of the Hilbert-Schmidt Independence Criterion (HSIC); a minimal numerical sketch appears after this list
No fortress, purely open ground. OpenManus is Coming.
This is the homepage of a new book entitled "Mathematical Foundations of Reinforcement Learning."
MinRL provides clean, minimal implementations of fundamental reinforcement learning algorithms in a customizable GridWorld environment. The project focuses on educational clarity and implementation…
Implementations of reinforcement learning algorithms in Python, OpenAI Gym, and TensorFlow; exercises and solutions to accompany Sutton's book and David Silver's course (see the Q-learning sketch after this list)
Models and examples built with TensorFlow
Understanding Deep Networks via Extremal Perturbations and Smooth Masks
Model interpretability and understanding for PyTorch
👋 Xplique is a Neural Networks Explainability Toolbox
Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more.
Skyformer: Remodel Self-Attention with Gaussian Kernel and Nyström Method (NeurIPS 2021)
An implementation of Performer, a linear-attention Transformer, in PyTorch (the random-feature attention sketch after this list illustrates the core idea)
nyu-mll/spinn (forked from stanfordnlp/spinn): NYU ML² work on sentence encoding with tree structure and dynamic graphs
Transformer: PyTorch Implementation of "Attention Is All You Need"
This repository contains PyTorch implementations of various random feature maps for dot product kernels.
✔ (Completed) The most comprehensive deep learning notes [Tudui's PyTorch tutorial] [Mu Li's "Dive into Deep Learning"] [Andrew Ng's Deep Learning course]
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single Transformer encoder, in PyTorch
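Several of the starred repositories (Spectraformer, Performer, Skyformer, the random-feature-maps collection) share one idea: approximate the softmax kernel exp(q·k) with random features so attention costs O(n) instead of O(n²) in sequence length. Below is a minimal NumPy sketch of that idea using positive random features, as popularized by Performer's FAVOR+; every name in it is illustrative, and it is not the API of any repository listed above.

```python
import numpy as np

def positive_random_features(x, w):
    """phi(x) = exp(w.x - ||x||^2 / 2) / sqrt(m), chosen so that
    E_w[phi(q) . phi(k)] = exp(q . k), the (unnormalized) softmax kernel."""
    m = w.shape[0]
    proj = x @ w.T                                        # (n, m)
    sq_norm = 0.5 * np.sum(x**2, axis=1, keepdims=True)   # (n, 1)
    return np.exp(proj - sq_norm) / np.sqrt(m)

def linear_attention(q, k, v, num_features=256, seed=0):
    """softmax(QK^T / sqrt(d)) V ~= phi(Q) [phi(K)^T V] / (phi(Q) [phi(K)^T 1]),
    computed without ever materializing the n x n attention matrix."""
    d = q.shape[1]
    w = np.random.default_rng(seed).standard_normal((num_features, d))
    q, k = q / d**0.25, k / d**0.25        # split the 1/sqrt(d) scaling over q and k
    phi_q = positive_random_features(q, w)
    phi_k = positive_random_features(k, w)
    numer = phi_q @ (phi_k.T @ v)          # (n, d_v), linear in sequence length
    denom = phi_q @ phi_k.sum(axis=0)      # (n,) row-wise softmax normalizers
    return numer / denom[:, None]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    q, k, v = (rng.standard_normal((64, 16)) for _ in range(3))
    approx = linear_attention(q, k, v, num_features=4096)
    scores = q @ k.T / np.sqrt(16)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    exact = (weights / weights.sum(axis=1, keepdims=True)) @ v
    print("max abs error:", np.abs(approx - exact).max())
```

Since phi(K)ᵀV is only m × d_v, memory stays linear in sequence length, and the approximation tightens as num_features grows; Performer additionally uses orthogonal random features to reduce variance.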
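For the Hilbert-Schmidt Independence Criterion entry, here is a compact sketch of the standard biased empirical estimator from Gretton et al., HSIC_b = tr(KHLH) / (n − 1)², where K and L are kernel Gram matrices over the two samples and H = I − (1/n)11ᵀ centers them. The helper names and the median-heuristic bandwidth are illustrative choices, not the listed repository's API.

```python
import numpy as np

def rbf_gram(x, sigma=None):
    """Gaussian-kernel Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2));
    sigma defaults to a common variant of the median heuristic."""
    sq = np.sum(x**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (x @ x.T)
    if sigma is None:
        sigma = np.sqrt(0.5 * np.median(d2[d2 > 0]))
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic_biased(x, y):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2 with H = I - 11^T / n.
    Near zero for independent samples, larger under dependence."""
    n = x.shape[0]
    k, l = rbf_gram(x), rbf_gram(y)
    h = np.eye(n) - np.ones((n, n)) / n
    return np.trace(k @ h @ l @ h) / (n - 1) ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((200, 1))
    y_dep = x**2 + 0.1 * rng.standard_normal((200, 1))   # nonlinear dependence
    y_ind = rng.standard_normal((200, 1))
    print("dependent:  ", hsic_biased(x, y_dep))
    print("independent:", hsic_biased(x, y_ind))
```

Note that HSIC picks up the x → x² relationship even though the Pearson correlation of x and x² is zero for symmetrically distributed x.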
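In the spirit of the GridWorld reinforcement-learning entries (MinRL and the Sutton/Silver companion code), a self-contained tabular Q-learning sketch follows. The environment, reward scheme, and hyperparameters are assumptions made for illustration and do not mirror any of those repositories.

```python
import numpy as np

def q_learning_gridworld(size=4, episodes=500, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a size x size grid: start at (0, 0), +1 reward at
    the bottom-right goal, -0.01 per step, epsilon-greedy exploration."""
    rng = np.random.default_rng(seed)
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]            # up, down, left, right
    goal = (size - 1, size - 1)
    q = np.zeros((size, size, 4))
    for _ in range(episodes):
        r, c = 0, 0
        while (r, c) != goal:
            a = rng.integers(4) if rng.random() < eps else int(np.argmax(q[r, c]))
            nr = min(max(r + moves[a][0], 0), size - 1)   # walls clamp movement
            nc = min(max(c + moves[a][1], 0), size - 1)
            reward = 1.0 if (nr, nc) == goal else -0.01
            # One-step TD update toward reward + gamma * max_a' Q(s', a').
            q[r, c, a] += alpha * (reward + gamma * q[nr, nc].max() - q[r, c, a])
            r, c = nr, nc
    return q

if __name__ == "__main__":
    q = q_learning_gridworld()
    arrows = np.array(["^", "v", "<", ">"])
    print(arrows[q.argmax(axis=2)])   # greedy policy for each cell
```

After training, the printed greedy policy should route each cell toward the bottom-right goal; the goal cell's own arrow is meaningless, since its Q-values are never updated for a terminal state.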