An all-in-one Docker Compose configuration providing access to local and external LLMs through multiple chat interfaces. The stack consists of:
- Caddy: Acts as the central entry point for the whole stack (a Caddyfile sketch follows the model list below)
- Ollama: Provides access to local LLM models
- LiteLLM: OpenAI-compatible API proxy for the Ollama-served local models and upstream provider models (a configuration sketch follows the model list below)
- Multiple ChatGPT-style web interfaces for interacting with the models
Available models:
- Local (via Ollama)
  - local-google-gemma3
  - local-llama3-8b
- OpenAI
  - openai-gpt-4-turbo
  - openai-gpt-4o
  - openai-o3
  - openai-o4-mini
- Google
  - google-gemini-1.5-pro
- Anthropic
  - anthropic-claude-3-7-sonnet
  - anthropic-claude-3-5-sonnet
- Groq
  - groq-llama-3.3-70b
  - groq-llama-4-maverick
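The model names above read like LiteLLM aliases. As a non-authoritative sketch, a LiteLLM `config.yaml` wiring one local and one upstream model could look like the following; the service name `ollama`, the model tag, and the key variable are assumptions, not taken from this repository:

```yaml
model_list:
  # Local model, served by the Ollama container on the compose network
  - model_name: local-llama3-8b
    litellm_params:
      model: ollama/llama3:8b            # Ollama model tag is an assumption
      api_base: http://ollama:11434      # Ollama's default port
  # Upstream model; the API key is read from the environment (.env)
  - model_name: openai-gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```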
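Similarly, the Caddy entry point presumably reverse-proxies to the other containers. A minimal Caddyfile sketch, with container names and ports that are assumptions rather than this repo's actual values:

```
# Hypothetical Caddyfile; container names and ports are assumptions.
:3000 {
	# Default site: one of the chat UIs (AnythingLLM's default port is 3001)
	reverse_proxy anythingllm:3001

	# OpenAI-compatible API, served by LiteLLM (default port 4000)
	handle_path /api/* {
		reverse_proxy litellm:4000
	}
}
```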
Prerequisites:
- Docker
- Docker Compose
- Git
Setup:
- Clone this repository
- Copy the default config files:
  ```sh
  cp env.default .env
  cp anythingllm.env.default .anythingllm.env
  ```
- Edit `.env` and add the relevant API keys (a hedged example of typical keys follows this list)
- Start the Docker Compose configuration:
  ```sh
  docker-compose up
  ```
- Access the Caddy webserver at http://localhost:3000
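The variable names that actually matter are whatever `env.default` and the LiteLLM config reference; as a hedged illustration, provider keys for the models listed above might look like:

```sh
# Hypothetical variable names; check env.default for the real ones.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...
GROQ_API_KEY=gsk_...
```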
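Once the stack is up, a quick smoke test against the OpenAI-compatible API helps confirm the wiring. Whether LiteLLM is reachable at this exact path through Caddy depends on the routing, so treat the base URL as an assumption:

```sh
# Assumes Caddy forwards /v1/* to LiteLLM; adjust the URL to match
# the actual Caddyfile routing.
curl http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "local-llama3-8b",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```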