Focused on sharing the technical principles of machine learning and deep learning, with 8 years of writing experience. Author of 《跟我一起学机器学习》 (Learn Machine Learning with Me), 《跟我一起学深度学习》 (Learn Deep Learning with Me), and "This Post Is All You Need" (3M+ total reads), among other works. If interested, you can add the author (掌柜) on WeChat: nulls8
A playbook for systematically maximizing the performance of deep learning models.
Chinese version of GPT2 training code, using BERT tokenizer.
Code for the paper "Language Models are Unsupervised Multitask Learners"
An implementation of the BERT model and its related downstream tasks based on the PyTorch framework. @月来客栈
A repository containing implementations of more than 12 common statistical machine learning algorithms, covering their principles and implementations along with video explanations. Produced by @月来客栈.