Stars
Docker containers for running training scripts on AzureML
Datasets, tools, and benchmarks for representation learning of code.
A set of examples around PyTorch in Vision, Text, Reinforcement Learning, etc.
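For context, a minimal sketch of the kind of supervised training loop those examples walk through; the toy model, synthetic data, and hyperparameters here are made up purely for illustration.

```python
import torch
from torch import nn, optim

# Toy classifier and synthetic data, for illustration only.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(64, 4)          # 64 samples, 4 features
y = torch.randint(0, 2, (64,))  # binary labels

for epoch in range(5):
    optimizer.zero_grad()
    loss = criterion(model(x), y)  # forward pass
    loss.backward()                # backward pass
    optimizer.step()               # parameter update
    print(f"epoch {epoch}: loss={loss.item():.4f}")
```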
Ongoing research on training transformer models at scale
XLNet: Generalized Autoregressive Pretraining for Language Understanding
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
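A minimal usage sketch of the 🤗 Transformers `pipeline` API; the task string and input text are illustrative, and the first call downloads a default pretrained checkpoint.

```python
from transformers import pipeline

# "sentiment-analysis" pulls a default pretrained model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("Representation learning of code works surprisingly well."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```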
Code for the paper "Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks"
Utilities used by the Deep Program Understanding team
Visualizer for neural network, deep learning and machine learning models
Code for the model presented in the paper: "code2seq: Generating Sequences from Structured Representations of Code"
TensorFlow code for the neural network presented in the paper: "code2vec: Learning Distributed Representations of Code"
Convert TensorFlow, Keras, TensorFlow.js, and TFLite models to ONNX
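This description matches the tf2onnx project; assuming that is the tool, here is a sketch of converting a Keras model with its `from_keras` Python API. The toy model is illustrative, and the CLI entry point shown in the comment is the other common route.

```python
import tensorflow as tf
import tf2onnx

# Toy Keras model, for illustration only.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation="softmax", input_shape=(4,)),
])

# input_signature pins the input name and shape for the exported graph.
spec = (tf.TensorSpec((None, 4), tf.float32, name="x"),)
onnx_model, _ = tf2onnx.convert.from_keras(
    model, input_signature=spec, output_path="model.onnx"
)

# Equivalent CLI for a SavedModel on disk:
#   python -m tf2onnx.convert --saved-model ./saved_model --output model.onnx
```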
Spark MLlib wrapper for the Snowball framework
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
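This description matches Horovod; assuming that is the framework, below is a minimal sketch of its PyTorch integration (the TensorFlow/Keras and MXNet bindings follow the same wrap-the-optimizer pattern). It would be launched with something like `horovodrun -np 4 python train.py`.

```python
import torch
import horovod.torch as hvd

hvd.init()  # one process per worker; rank/size come from the launcher
if torch.cuda.is_available():
    torch.cuda.set_device(hvd.local_rank())

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())

# Wrap the optimizer so gradients are averaged across workers via allreduce,
# and start every worker from rank 0's initial weights.
optimizer = hvd.DistributedOptimizer(
    optimizer, named_parameters=model.named_parameters()
)
hvd.broadcast_parameters(model.state_dict(), root_rank=0)

x, y = torch.randn(32, 4), torch.randint(0, 2, (32,))
for step in range(10):
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()
```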
GAN Playground - Experiment with Generative Adversarial Nets in your browser. An introduction to GANs.
Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1
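That line of work (BinaryConnect/BinaryNet) binarizes weights in the forward pass while keeping real-valued weights for the update. A minimal sketch of the core idea, sign binarization with a straight-through estimator, is shown below; it illustrates the technique and is not the authors' code.

```python
import torch


def binarize(w: torch.Tensor) -> torch.Tensor:
    """Forward: sign(w) in {-1, +1}. Backward: straight-through estimator,
    i.e. gradients pass through as if binarization were the identity."""
    return w + (torch.sign(w) - w).detach()


# Real-valued "shadow" weights are what the optimizer actually updates.
w = torch.randn(8, 4, requires_grad=True)
x = torch.randn(16, 8)

y = x @ binarize(w)   # forward pass uses binary weights
loss = y.pow(2).mean()
loss.backward()       # gradients flow to the real-valued w
print(w.grad.shape)   # torch.Size([8, 4])
```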
Optimized primitives for collective multi-GPU communication
Language Interoperability
NVIDIA/caffe
Forked from BVLC/caffe. Caffe: a fast open framework for deep learning.
Distributed deep learning with a focus on distributed training, using Keras and Apache Spark.
A Jupyter notebook post on advanced NumPy techniques
Spark-based approximate nearest neighbor search using locality-sensitive hashing
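Spark MLlib ships the same idea as built-in estimators; the sketch below shows approximate nearest neighbors with `BucketedRandomProjectionLSH` (Euclidean-distance LSH). It illustrates the technique rather than that repository's own API.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import BucketedRandomProjectionLSH
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("lsh-ann").getOrCreate()

df = spark.createDataFrame(
    [(0, Vectors.dense([1.0, 1.0])),
     (1, Vectors.dense([1.0, -1.0])),
     (2, Vectors.dense([-1.0, -1.0]))],
    ["id", "features"],
)

# More hash tables -> better recall at the cost of more compute.
lsh = BucketedRandomProjectionLSH(
    inputCol="features", outputCol="hashes", bucketLength=2.0, numHashTables=3
)
model = lsh.fit(df)

# Approximate 2 nearest neighbors of a query point.
query = Vectors.dense([1.0, 0.0])
model.approxNearestNeighbors(df, query, 2).show()

spark.stop()
```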