Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
Open source multi-modal RAG for building AI apps over private knowledge.
A model-driven approach to building AI agents in just a few lines of code.
Evaluate your LLM's responses with Prometheus and GPT-4 💯
A website where you can compare every AI Model ✨
An example agent demonstrating streaming, tool use, and interactivity from your terminal. This agent builder can help you build your own agents and tools.
A set of tools that gives agents powerful capabilities.
Agent samples built using the Strands Agents SDK.
Command-line personal assistant using your favorite proprietary or local models, with access to 30+ tools
This MCP server provides documentation about Strands Agents to your GenAI tools, so you can use your favorite AI coding assistant to vibe-code Strands Agents.
A Discord bot for large language models. Add Gemini 2.5 Pro, Claude 3.7 Sonnet, GPT-4.1, and other models. Easily switch models, edit prompts, and enable web search.
Power up your data science workflow with ChatGPT.
Documentation for the Strands Agents SDK. A model-driven approach to building AI agents in just a few lines of code.
🚅 LiteLLM Proxy for Google Cloud Generative AI
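Gateways like the ones above accept the standard OpenAI chat-completions request shape and route it to the provider named in the `model` field. A minimal stdlib-only sketch of that payload (the model name and parameter values here are illustrative, not a definitive configuration):

```python
import json

# OpenAI-format chat-completions request body. A gateway such as LiteLLM
# forwards it to the backend named in "model" (the prefix below is illustrative).
request = {
    "model": "bedrock/anthropic.claude-3-haiku",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.2,
}

# Serialize the body as it would be POSTed to the gateway's
# /chat/completions endpoint.
body = json.dumps(request)
print(body)
```

Because every provider is exposed behind this one shape, switching backends is a change to the `model` string rather than to the calling code.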