This is a Go application built with hexagonal architecture that provides a web UI for interacting with LLMs. It uses langchaingo for LLM integration and connects to an Ollama model.
- Hexagonal architecture for clean separation of concerns
- Structured logging using Go's slog package
- Integration with Ollama models via langchaingo
- Modern web UI for chat interactions
- RESTful API for chat operations
- In-memory chat storage (can be extended to persistent storage)
The application follows the hexagonal architecture pattern:
- Core domain: Contains the business logic, domain models, and interfaces (ports)
- Adapters: Implement the interfaces defined in the core
  - Primary adapters: Handle incoming requests (HTTP)
  - Secondary adapters: Connect to external systems (LLM, repository)
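To make the ports-and-adapters split concrete, here is a minimal sketch of what the core's ports and the in-memory repository adapter could look like. The type and method names are illustrative, not the ones actually used in this repository, and the code is folded into one package for brevity.

```go
// Illustrative port definitions and an in-memory secondary adapter.
// In the real codebase the core and adapters live in separate packages.
package chat

import (
	"context"
	"errors"
	"sync"
)

// Message and Chat are example domain models.
type Message struct {
	Role    string // e.g. "user" or "assistant"
	Content string
}

type Chat struct {
	ID       string
	Messages []Message
}

// ChatRepository is a secondary port; storage adapters implement it.
type ChatRepository interface {
	Save(ctx context.Context, c *Chat) error
	Get(ctx context.Context, id string) (*Chat, error)
	List(ctx context.Context) ([]*Chat, error)
	Delete(ctx context.Context, id string) error
}

// LLM is a secondary port; the langchaingo/Ollama adapter implements it.
type LLM interface {
	Generate(ctx context.Context, prompt string) (string, error)
}

// InMemoryRepository is a secondary adapter backing ChatRepository with a map.
type InMemoryRepository struct {
	mu    sync.RWMutex
	chats map[string]*Chat
}

func NewInMemoryRepository() *InMemoryRepository {
	return &InMemoryRepository{chats: make(map[string]*Chat)}
}

func (r *InMemoryRepository) Save(_ context.Context, c *Chat) error {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.chats[c.ID] = c
	return nil
}

func (r *InMemoryRepository) Get(_ context.Context, id string) (*Chat, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	c, ok := r.chats[id]
	if !ok {
		return nil, errors.New("chat not found")
	}
	return c, nil
}

func (r *InMemoryRepository) List(_ context.Context) ([]*Chat, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	out := make([]*Chat, 0, len(r.chats))
	for _, c := range r.chats {
		out = append(out, c)
	}
	return out, nil
}

func (r *InMemoryRepository) Delete(_ context.Context, id string) error {
	r.mu.Lock()
	defer r.mu.Unlock()
	delete(r.chats, id)
	return nil
}
```

Because the core only depends on these interfaces, the in-memory repository can later be swapped for a persistent one without touching the domain logic.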
The application can be configured through a JSON file or environment variables. Default configuration:
```json
{
  "server": {
    "port": 8080
  },
  "llm": {
    "provider": "ollama",
    "ollama": {
      "enabled": true,
      "endpoint": "http://192.168.1.222:11434",
      "model": "gemma3:1b",
      "max_tokens": 256,
      "timeout_seconds": 100
    }
  }
}
```
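As a sketch of how these settings could be loaded and passed to langchaingo's Ollama client: the Config types below are hypothetical (field names simply mirror the JSON above), and the exact langchaingo option set may vary by version.

```go
// Hypothetical config types mirroring the JSON above, plus construction of a
// langchaingo Ollama client from them. Names are illustrative.
package app

import (
	"context"
	"encoding/json"
	"os"
	"time"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/ollama"
)

type Config struct {
	Server struct {
		Port int `json:"port"`
	} `json:"server"`
	LLM struct {
		Provider string `json:"provider"`
		Ollama   struct {
			Enabled        bool   `json:"enabled"`
			Endpoint       string `json:"endpoint"`
			Model          string `json:"model"`
			MaxTokens      int    `json:"max_tokens"`
			TimeoutSeconds int    `json:"timeout_seconds"`
		} `json:"ollama"`
	} `json:"llm"`
}

// Load reads a JSON config file into Config.
func Load(path string) (*Config, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	var cfg Config
	if err := json.Unmarshal(data, &cfg); err != nil {
		return nil, err
	}
	return &cfg, nil
}

// NewOllamaLLM builds a langchaingo Ollama client from the config.
func NewOllamaLLM(cfg *Config) (*ollama.LLM, error) {
	return ollama.New(
		ollama.WithServerURL(cfg.LLM.Ollama.Endpoint),
		ollama.WithModel(cfg.LLM.Ollama.Model),
	)
}

// Generate sends a single prompt, applying the configured token and time limits.
func Generate(ctx context.Context, llm *ollama.LLM, cfg *Config, prompt string) (string, error) {
	ctx, cancel := context.WithTimeout(ctx, time.Duration(cfg.LLM.Ollama.TimeoutSeconds)*time.Second)
	defer cancel()
	return llms.GenerateFromSinglePrompt(ctx, llm, prompt,
		llms.WithMaxTokens(cfg.LLM.Ollama.MaxTokens),
	)
}
```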
- Ensure you have Go 1.22+ installed
- Clone the repository
- Run the application:
```sh
go run ./cmd/server/main.go
```

To run with custom configuration:

```sh
go run ./cmd/server/main.go -config=/path/to/config.json
```

To run with debug logging:

```sh
go run ./cmd/server/main.go -debug
```
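The -config and -debug flags might be wired up roughly like this in cmd/server/main.go. Only the flag names come from the commands above; the rest is an illustrative sketch using Go's slog package for structured logging.

```go
// Hypothetical flag handling for the server entry point; only the flag names
// (-config, -debug) are taken from the commands above.
package main

import (
	"flag"
	"log/slog"
	"os"
)

func main() {
	configPath := flag.String("config", "", "path to a JSON config file")
	debug := flag.Bool("debug", false, "enable debug logging")
	flag.Parse()

	// Structured logging with slog; -debug lowers the level to Debug.
	level := slog.LevelInfo
	if *debug {
		level = slog.LevelDebug
	}
	logger := slog.New(slog.NewJSONHandler(os.Stdout, &slog.HandlerOptions{Level: level}))
	slog.SetDefault(logger)

	slog.Info("starting server", "config", *configPath, "debug", *debug)
	// ... load the config, wire up the adapters, and start the HTTP server here.
}
```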
The REST API exposes the following endpoints:

- GET /api/chats - List all chats
- POST /api/chats - Create a new chat
- GET /api/chats/{chatID} - Get a specific chat
- POST /api/chats/{chatID}/messages - Send a message to a chat
- DELETE /api/chats/{chatID} - Delete a chat
- GET /api/model - Get information about the current LLM model
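As a usage example, a small Go client could create a chat and post a message along these lines. The JSON payload and response field names ("title", "content", "id") are assumptions, so check the handlers for the actual shapes.

```go
// Illustrative client for the chat API. Field names in the request and
// response bodies are assumptions, not confirmed payloads.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

const baseURL = "http://localhost:8080"

func main() {
	// Create a new chat.
	body, _ := json.Marshal(map[string]string{"title": "demo"})
	resp, err := http.Post(baseURL+"/api/chats", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var chat struct {
		ID string `json:"id"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&chat); err != nil {
		panic(err)
	}
	fmt.Println("created chat", chat.ID)

	// Send a message to the chat.
	msg, _ := json.Marshal(map[string]string{"content": "Hello!"})
	resp2, err := http.Post(baseURL+"/api/chats/"+chat.ID+"/messages", "application/json", bytes.NewReader(msg))
	if err != nil {
		panic(err)
	}
	defer resp2.Body.Close()
	fmt.Println("message sent, status:", resp2.Status)
}
```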
The application provides a web UI accessible at:
- GET / - Home page with list of chats
- GET /chat/{chatID} - Chat interface for a specific chat
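Since routing is handled by Chi (listed in the dependencies below), the API and web UI routes above would typically be registered along these lines. The handler names are placeholders, not the actual ones in this repository.

```go
// Illustrative Chi route registration for the routes documented above.
package web

import (
	"net/http"

	"github.com/go-chi/chi/v5"
)

// ChatHandler is a placeholder primary adapter holding the HTTP handlers.
type ChatHandler struct{}

func (h *ChatHandler) Home(w http.ResponseWriter, r *http.Request)        {}
func (h *ChatHandler) ChatPage(w http.ResponseWriter, r *http.Request)    {}
func (h *ChatHandler) ListChats(w http.ResponseWriter, r *http.Request)   {}
func (h *ChatHandler) CreateChat(w http.ResponseWriter, r *http.Request)  {}
func (h *ChatHandler) GetChat(w http.ResponseWriter, r *http.Request)     {}
func (h *ChatHandler) SendMessage(w http.ResponseWriter, r *http.Request) {}
func (h *ChatHandler) DeleteChat(w http.ResponseWriter, r *http.Request)  {}
func (h *ChatHandler) ModelInfo(w http.ResponseWriter, r *http.Request)   {}

// newRouter wires the web UI and REST API routes onto a Chi router.
func newRouter(h *ChatHandler) http.Handler {
	r := chi.NewRouter()

	// Web UI.
	r.Get("/", h.Home)
	r.Get("/chat/{chatID}", h.ChatPage)

	// REST API.
	r.Route("/api", func(r chi.Router) {
		r.Get("/chats", h.ListChats)
		r.Post("/chats", h.CreateChat)
		r.Get("/chats/{chatID}", h.GetChat)
		r.Post("/chats/{chatID}/messages", h.SendMessage)
		r.Delete("/chats/{chatID}", h.DeleteChat)
		r.Get("/model", h.ModelInfo)
	})

	return r
}
```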
- Chi router for HTTP routing
- langchaingo for LLM integration
MIT