Stars
Joint Optimization Framework for Learning with Noisy Labels
Docker container to run PyCharm Community Edition
A curated list of resources for Learning with Noisy Labels
PyTorch implementation of 'Training Deep Neural-Networks Using a Noise Adaptation Layer' (ICLR 2017)
Tips for releasing research code in Machine Learning (with official NeurIPS 2020 recommendations)
Deep learning model converter for PaddlePaddle.
Training Deep Neural-Networks Using a Noise Adaptation Layer
A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.
I'm compiling comprehensive coding tutorials for many different languages and frameworks! 🐲
An optimizer that trains as fast as Adam and generalizes as well as SGD.
NeurIPS'18: Masking: A New Perspective of Noisy Supervision
NeurIPS'19: Are Anchor Points Really Indispensable in Label-Noise Learning?
Applied sparse regularization (L1), weight decay regularization (L2), ElasticNet, GroupLasso, and GroupSparseLasso to neural networks.
🎓 Path to a free self-taught education in Computer Science!
A simple NN in PyTorch for the Medium article.
Experiments in positive-unlabeled learning
A game theoretic approach to explain the output of any machine learning model.
Unsupervised Learning by Predicting Noise
A PyTorch implementation of the Transformer model in "Attention is All You Need".
PyTorch deep learning projects made easy.
Code for paper "Dimensionality-Driven Learning with Noisy Labels" - ICML 2018
Code and models accompanying "Deep Predictive Coding Networks for Video Prediction and Unsupervised Learning"