This is the official repository for the paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors" (ICML 2024).
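The title's claim is that low-rank adapter updates behave like random-projection compressions of the full gradient. As a rough illustration of that general idea only (not the repository's actual algorithm), here is a minimal JAX sketch of compressing a gradient with a random Gaussian down-projection and approximately reconstructing it; the `compress`/`decompress` helpers and the rank of 32 are hypothetical choices made for this example.

```python
import jax
import jax.numpy as jnp

# Illustrative sketch only: random-projection gradient compression.
# Flora's real method (projection resampling, optimizer-state
# compression, etc.) is implemented in this repository, not here.

def compress(key, grad, rank):
    """Project an (m, n) gradient down to a (rank, n) sketch."""
    m, _ = grad.shape
    proj = jax.random.normal(key, (rank, m)) / jnp.sqrt(rank)
    return proj, proj @ grad

def decompress(proj, compressed):
    """Approximately reconstruct the (m, n) gradient from its sketch."""
    return proj.T @ compressed

key = jax.random.PRNGKey(0)
grad = jax.random.normal(jax.random.PRNGKey(1), (512, 256))
proj, grad_small = compress(key, grad, rank=32)   # 16x smaller along rows
grad_approx = decompress(proj, grad_small)
```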
Implementation of the PSGD optimizer in JAX
JAX implementations of various deep reinforcement learning algorithms.
Tensor Networks for Machine Learning
Goal-conditioned reinforcement learning like 🔥
JAX/Flax implementation of finite-size scaling
An implementation of the Adan optimizer for Optax
Training methodologies for autoregressive neural operators/emulators in JAX.
H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for a glyph. Its decoder is composed of multiple independent decoders that act on the code vector to reconstruct a point cloud representing a glyph.
A simple trainer for Flax
JAX implementation of "Classical and Quantum Algorithms for Orthogonal Neural Networks" (Kerenidis et al., 2021)
Variational Graph Autoencoder implemented using Jax & Jraph
An Optax-based JAX implementation of the IVON optimizer for large-scale VI training of NNs (ICML'24 spotlight)
dm-haiku implementation of hyperbolic neural networks
Direct port of TD3_BC to JAX using Haiku and optax.
The (unofficial) vanilla version of WaveRNN
Oxford MSc thesis. PriorVAE with graph convolutional networks for learning locally-aware spatial prior distributions
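Since the repositories above all build on Optax, a minimal, generic training loop may help readers new to the topic. The sketch below uses a toy quadratic loss and `optax.adam`; the loss function and hyperparameters are illustrative, and any optimizer from the repositories listed here could be substituted as long as it follows Optax's standard `GradientTransformation` interface.

```python
import jax
import jax.numpy as jnp
import optax

# Toy loss: drive every parameter toward 3.0.
def loss_fn(params):
    return jnp.sum((params - 3.0) ** 2)

params = jnp.zeros(4)
opt = optax.adam(learning_rate=1e-1)   # swap in e.g. an Adan or IVON implementation
opt_state = opt.init(params)

@jax.jit
def step(params, opt_state):
    loss, grads = jax.value_and_grad(loss_fn)(params)
    updates, opt_state = opt.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss

for _ in range(100):
    params, opt_state, loss = step(params, opt_state)
```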