LM-Proxy is an OpenAI-compatible HTTP proxy server for LLM inference. It can route requests to the Google, Anthropic, and OpenAI APIs, to local PyTorch inference, and more.
Development Status: still in early development; bookmark it and check back later.
- @todo
```shell
# Install LM-Proxy via pip
pip install lm-proxy
```
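Since the proxy exposes an OpenAI-compatible API, any OpenAI-style client should be able to talk to it. Below is a minimal sketch of building a chat completion request against a locally running proxy; the base URL, port, model name, and API key are assumptions for illustration, not documented defaults.

```python
import json
import urllib.request

# Assumed address of a locally running LM-Proxy instance.
BASE_URL = "http://localhost:8000/v1"

# An OpenAI-style chat completion request body; the proxy is expected
# to accept the same shape as api.openai.com. Model name is hypothetical.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer not-needed",  # the proxy may ignore the key
    },
    method="POST",
)
# With the proxy running, urllib.request.urlopen(req) would send the request
# and return an OpenAI-style JSON response.
```

The same endpoint should work with the official `openai` client library by pointing its `base_url` at the proxy.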
We ❤️ contributions! See CONTRIBUTING.md.
Licensed under the MIT License.
© 2022–2025 Vitalii Stepanenko