Stars
PyTorch implementation of IndRNN with custom C++/CUDA extensions
PyTorch library for fast transformer implementations
Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
VIP cheatsheets for Stanford's CS 229 Machine Learning
⛽️ "Algorithm Pass-Through Handbook": a detailed beginner-friendly tutorial on algorithms and data structures, starting from zero, with in-depth solutions to 850+ LeetCode problems and 200 popular big-tech interview questions.
PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
The entmax mapping and its loss, a family of sparse softmax alternatives.
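The α=2 member of the entmax family is sparsemax, which projects scores onto the probability simplex and can zero out low-scoring entries. A minimal pure-Python sketch of that projection (an illustrative implementation, not the library's API):

```python
def sparsemax(z):
    """Project a list of scores onto the probability simplex,
    producing a sparse distribution (entmax with alpha=2)."""
    zs = sorted(z, reverse=True)
    cum = 0.0          # running sum of the top-j sorted scores
    k, ksum = 1, zs[0]
    for j, zj in enumerate(zs, start=1):
        cum += zj
        # support size k(z) = max{ j : 1 + j*z_(j) > sum_{i<=j} z_(i) }
        if 1 + j * zj > cum:
            k, ksum = j, cum
    tau = (ksum - 1.0) / k          # threshold subtracted from every score
    return [max(zi - tau, 0.0) for zi in z]

# Unlike softmax, sparsemax can assign exact zeros:
print(sparsemax([3.0, 1.0, 0.2]))   # -> [1.0, 0.0, 0.0]
print(sparsemax([1.0, 0.5, -1.0]))  # -> [0.75, 0.25, 0.0]
```

The outputs always sum to 1, but entries far below the threshold receive exactly zero probability, which is the "sparse softmax alternative" the entmax package generalizes to other α values.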
🍀 PyTorch implementations of various Attention Mechanisms, MLPs, Re-parameterization, and Convolution modules, helpful for further understanding the papers.⭐⭐⭐
Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
A concise but complete full-attention transformer with a set of promising experimental features from various papers
A collection of Diamond metric collectors for Slurm.
Python package for multivariate hypothesis testing
Transformer and MultiTransformer layers for stock volatility forecasting
Merlion: A Machine Learning Framework for Time Series Intelligence
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
A project applying deep learning to stock market prediction.
Code to accompany our paper Chen and Zimmermann (2020), "Open source cross-sectional asset pricing"
TorchBench is a collection of open source benchmarks used to evaluate PyTorch performance.
A simple PyTorch RNN package including a general RNN framework
Python solutions to the exercises from the book "An Introduction to Statistical Learning".
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Genetic Programming in Python, with a scikit-learn inspired API
A game theoretic approach to explain the output of any machine learning model.
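The game-theoretic idea behind this (SHAP) is the Shapley value: a feature's attribution is its average marginal contribution over all feature orderings. For a tiny model, exact Shapley values can be computed by brute force in pure Python (an illustrative sketch with hypothetical helper names, not the shap library's API):

```python
from itertools import permutations

def shapley_values(f, x, baseline):
    """Exact Shapley values for model f at input x, relative to a
    baseline input. Averages each feature's marginal contribution
    over all n! orderings -- feasible only for a handful of features."""
    n = len(x)
    phi = [0.0] * n
    perms = list(permutations(range(n)))
    for perm in perms:
        cur = list(baseline)     # start from the baseline input
        prev = f(cur)
        for i in perm:           # reveal features one at a time
            cur[i] = x[i]
            now = f(cur)
            phi[i] += now - prev  # marginal contribution of feature i
            prev = now
    return [p / len(perms) for p in phi]

# A model with an interaction term: credit for x0*x1 is split evenly.
f = lambda v: v[0] * v[1]
print(shapley_values(f, [1.0, 1.0], [0.0, 0.0]))  # -> [0.5, 0.5]
```

By construction the attributions satisfy the efficiency property, `sum(phi) == f(x) - f(baseline)`; SHAP's contribution is estimating these values tractably for real models instead of enumerating all orderings.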