A simple autograd engine built from scratch as a personal learning project: the goal is to understand how frameworks like PyTorch implement automatic differentiation and neural networks under the hood.
The core `Val` class handles automatic differentiation:
```python
# Simple example of how it works
x = Val(2.0)
y = Val(3.0)
z = x * y + x.tanh()
z.backward()  # computes gradients of z with respect to x and y
```
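For reference, here is a minimal sketch of how such a `Val` class can be implemented, following the micrograd pattern credited below. The public names (`Val`, `tanh`, `backward`, `grad`) come from the examples in this README; the internals (`_prev`, `_backward`, the topological sort) are assumptions about the design, not the repo's actual code:

```python
import math

class Val:
    """Minimal sketch of a scalar autograd value (assumed internals)."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # set by the op that produced this node
        self._prev = set(_children)    # parent nodes in the computation graph

    def __add__(self, other):
        other = other if isinstance(other, Val) else Val(other)
        out = Val(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Val) else Val(other)
        out = Val(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Val(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad  # d(tanh x)/dx = 1 - tanh^2(x)
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()
```

Each operation records a closure that knows its local derivative; `backward()` then walks the graph in reverse topological order, applying the chain rule and accumulating into `.grad` (so a value used twice, like `x` in the example above, gets contributions from both uses).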
Basic neural network implementation:
```python
# Creating a small network
model = MLP(nin=2, nouts=[16, 16, 1])  # 2 inputs, 2 hidden layers, 1 output
```
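The network classes aren't shown above, so here is a rough sketch of how an `MLP` matching that call signature might be composed from `Val` objects, again following the micrograd layout; `Neuron` and `Layer` are hypothetical helper names, not confirmed parts of the repo:

```python
import random

class Neuron:
    """One unit: tanh(w . x + b). Hypothetical helper class."""
    def __init__(self, nin):
        self.w = [Val(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Val(0.0)

    def __call__(self, x):
        act = self.b
        for wi, xi in zip(self.w, x):
            act = act + wi * xi
        return act.tanh()

    def parameters(self):
        return self.w + [self.b]


class Layer:
    """A list of neurons sharing the same inputs. Hypothetical helper class."""
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]

    def __call__(self, x):
        outs = [n(x) for n in self.neurons]
        return outs[0] if len(outs) == 1 else outs

    def parameters(self):
        return [p for n in self.neurons for p in n.parameters()]


class MLP:
    """Chains layers; matches the MLP(nin=..., nouts=[...]) call above."""
    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

    def parameters(self):
        return [p for layer in self.layers for p in layer.parameters()]
```

A forward pass is then `out = model([1.0, -2.0])`; calling `out.backward()` fills in `p.grad` for every `p` in `model.parameters()`.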
Tests to make sure everything works:
```python
# Testing gradients
x = Val(2.0)
y = x * x
y.backward()
assert x.grad == 4.0  # dy/dx = 2x = 4 at x = 2
```
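Hand-derived asserts like the one above only cover cases you can differentiate by hand; a finite-difference check compares the engine's analytic gradient against a numerical estimate of the same function. This is a sketch, not an existing test in the repo, and `numerical_grad` is a hypothetical helper:

```python
import math

def numerical_grad(f, x0, eps=1e-6):
    # Central difference: (f(x0+eps) - f(x0-eps)) / (2*eps)
    return (f(x0 + eps) - f(x0 - eps)) / (2 * eps)

# Analytic gradient from the engine...
x = Val(2.0)
z = x * x + x.tanh()
z.backward()

# ...versus a numerical estimate of the same function on plain floats.
f = lambda v: v * v + math.tanh(v)
assert abs(x.grad - numerical_grad(f, 2.0)) < 1e-4
```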
Just clone and run the tests:
```bash
git clone https://github.com/yourusername/reigrad.git
cd reigrad
python test_engine.py
```
These resources helped me understand the concepts:
- Andrej Karpathy's micrograd
- PyTorch documentation
- Various autograd tutorials
built with ❤️ by ctxnn