A tiny autograd engine for learning purposes, inspired by micrograd. Implements reverse-mode automatic differentiation (backpropagation) in pure Python.
- Scalar-valued autograd engine
- Pure Python Tensor class with broadcasting capabilities
- Neural network library with modular design
- No external dependencies for core functionality
- Supports basic arithmetic operations and common activation functions
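To make the core idea concrete, here is a minimal, self-contained sketch of reverse-mode automatic differentiation in pure Python. It mirrors the names used by punygrad (`Scalar`, `backprop`) but is an illustrative standalone implementation, not the library's actual code:

```python
# Minimal reverse-mode autodiff sketch (standalone; not punygrad's code).
class Scalar:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backprop(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

a, b = Scalar(2.0), Scalar(-3.0)
y = a * b + a   # y = a*b + a = -4.0
y.backprop()
print(a.grad)   # dy/da = b + 1 = -2.0
print(b.grad)   # dy/db = a = 2.0
```

Each operation records a local backward rule as it builds the graph; `backprop` then walks the graph in reverse topological order, accumulating gradients via the chain rule.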
- Clone the repository:
git clone https://github.com/yourusername/punygrad.git
cd punygrad
- Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
- Install development dependencies (the core library itself needs none):
pip install -r requirements.txt
from punygrad.engine import Scalar
from punygrad.nn import MLP
# Create a simple neural network
model = MLP(nin=3, nouts=[4, 4, 1])
# Forward pass
x = [2.0, 3.0, -1.0]
output = model(x)
# Backward pass
output.backprop()
from punygrad.engine import Tensor
# Create tensors from nested lists
t1 = Tensor([[1, 2], [3, 4]])
t2 = Tensor([[5], [6]])
# Broadcasting operations
bt1, bt2 = t1.broadcast_with(t2)
# bt1 shape: (2, 2), bt2 shape: (2, 2)
# Convert to Python list for operations
data_list = t1.to_list()
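The broadcasting rule assumed here is the NumPy/PyTorch one: shapes are right-aligned, and each dimension pair must either match or contain a 1. A standalone sketch of the shape computation (punygrad's `broadcast_with` is assumed to follow the same rule):

```python
# Illustrative NumPy-style shape broadcasting in pure Python (standalone sketch).
def broadcast_shape(shape_a, shape_b):
    """Return the broadcast result shape of two shapes, right-aligned."""
    result = []
    # Walk the dimensions from the trailing axis backwards.
    for i in range(1, max(len(shape_a), len(shape_b)) + 1):
        da = shape_a[-i] if i <= len(shape_a) else 1
        db = shape_b[-i] if i <= len(shape_b) else 1
        if da != db and da != 1 and db != 1:
            raise ValueError(f"shapes {shape_a} and {shape_b} do not broadcast")
        result.append(max(da, db))
    return tuple(reversed(result))

print(broadcast_shape((2, 2), (2, 1)))  # (2, 2)
print(broadcast_shape((3, 1), (4,)))    # (3, 4)
```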
Run tests:
pytest
Format, lint, and type-check:
black .
ruff .
mypy .
punygrad/
├── src/
│ └── punygrad/ # Main package
│ ├── __init__.py # Package initialization
│ ├── engine.py # Autograd engine with Scalar and Tensor classes
│ └── nn.py # Neural network components
├── tests/ # Test directory
├── examples/ # Example usage
└── requirements.txt # Project dependencies
The `Scalar` class is the core of the autograd engine, supporting:
- Automatic differentiation
- Basic arithmetic operations (+, -, *, /, **)
- Common activation functions (exp, tanh)
The `Tensor` class provides multi-dimensional array support:
- Creation from nested lists
- Shape information and manipulation
- Broadcasting capabilities (similar to NumPy/PyTorch)
- Pure Python implementation with no external dependencies
The `nn` module provides building blocks for neural networks:
- Modules with parameters
- Linear layers
- Activation functions
- Multi-layer perceptrons (MLPs)
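These pieces compose in the usual way: a linear layer computes a weighted sum per output neuron, and an activation is applied elementwise between layers. A standalone forward-pass sketch (fixed weights for reproducibility; the real library initializes them randomly and tracks gradients via `Scalar`):

```python
# Standalone forward-pass sketch of a tiny MLP (weights fixed for clarity).
import math

class Linear:
    def __init__(self, weights, biases):
        self.weights = weights  # one row of weights per output neuron
        self.biases = biases

    def __call__(self, x):
        # y_j = sum_i(w_ji * x_i) + b_j
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weights, self.biases)]

def tanh(xs):
    return [math.tanh(x) for x in xs]

# Two-layer MLP: 3 inputs -> 2 hidden units (tanh) -> 1 output
layer1 = Linear([[0.1, -0.2, 0.3], [0.4, 0.0, -0.1]], [0.0, 0.1])
layer2 = Linear([[0.5, -0.5]], [0.0])

x = [2.0, 3.0, -1.0]
h = tanh(layer1(x))
y = layer2(h)
print(y)
```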
MIT License - see LICENSE file for details.
This project is heavily inspired by Andrej Karpathy's micrograd, a tiny scalar-valued autograd engine. punygrad extends those concepts with additional features, such as a Tensor class with broadcasting, while keeping the educational focus of the original project.