A comprehensive FastAPI-based AI backend service with RAG capabilities, document processing, and chat functionality.
```bash
# Start the services
docker-compose up -d

# Check service health
curl http://localhost:8001/health

# Access OpenWebUI at http://localhost:3000
```
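If you prefer to script the startup check, the sketch below polls the backend until it responds. The `/health` path comes from the curl command above; the shape of the JSON it returns is an assumption, so adjust the handling to whatever the endpoint actually sends back.

```python
# Minimal readiness check (sketch): poll the backend's /health endpoint
# until it answers, then print the body. Response fields are assumptions.
import time
import requests

BACKEND_URL = "http://localhost:8001"

def wait_for_backend(timeout_s: int = 60) -> dict:
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            resp = requests.get(f"{BACKEND_URL}/health", timeout=2)
            if resp.status_code == 200:
                return resp.json()
        except requests.ConnectionError:
            pass  # containers may still be starting
        time.sleep(2)
    raise TimeoutError("backend did not become healthy in time")

if __name__ == "__main__":
    print(wait_for_backend())
```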
All documentation files are located in the /readme folder.

The services are exposed at the following addresses:
- Backend API: http://localhost:8001
- OpenWebUI: http://localhost:3000
- Redis: localhost:6379
- ChromaDB: http://localhost:8000
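A quick way to confirm that Redis and ChromaDB are reachable from the host is via their official Python clients. This is a local debugging sketch against the ports listed above, not code taken from the service itself.

```python
# Connectivity check for the Redis and ChromaDB services listed above,
# using the `redis` and `chromadb` Python clients. Host/port values
# mirror the list; nothing here depends on the backend's own code.
import redis
import chromadb

# Redis: cache / session store
r = redis.Redis(host="localhost", port=6379, decode_responses=True)
print("Redis ping:", r.ping())

# ChromaDB: vector store used for semantic search
chroma = chromadb.HttpClient(host="localhost", port=8000)
print("Chroma heartbeat:", chroma.heartbeat())
print("Collections:", chroma.list_collections())
```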
- 🤖 Chat completions with multiple LLM models
- 📄 Document upload and RAG processing
- 💾 Redis caching and session management
- 🔍 Semantic search with ChromaDB
- 🛡️ API key authentication
- 📊 Health monitoring and metrics
- 🔧 AI tools integration (weather, time, units)
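To give a feel for how the authenticated chat endpoint might be called, here is a hedged sketch. The route, the API-key header name, the model id, and the payload fields are assumptions rather than the documented contract; check the routers for the real request shape.

```python
# Example chat-completion request (sketch). /v1/chat/completions, the
# X-API-Key header, the model id, and the payload fields are assumptions
# about this backend -- see routers/ for the actual contract.
import requests

BACKEND_URL = "http://localhost:8001"
API_KEY = "your-api-key"  # placeholder

payload = {
    "model": "gpt-4o-mini",  # hypothetical model id
    "messages": [
        {"role": "user", "content": "Summarize the uploaded contract."}
    ],
}

resp = requests.post(
    f"{BACKEND_URL}/v1/chat/completions",
    headers={"X-API-Key": API_KEY},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```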
The codebase is organized into modular components:
- core/ - Core functionality (database, schemas, error handling)
- managers/ - Business logic managers
- routers/ - API endpoints
- utils/ - Utility functions
- config/ - Configuration files
- scripts/ - Deployment scripts
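For orientation, the sketch below shows the usual FastAPI pattern this layout implies: each package under routers/ exposes an APIRouter that the application includes at startup, while core/ provides shared error handling. The module names in the comments are illustrative, not taken from the repository.

```python
# Sketch of how the modular layout typically wires together in FastAPI.
# Commented-out imports name hypothetical modules (routers.chat,
# routers.documents, core.errors) -- see the actual packages for real names.
from fastapi import FastAPI

# from routers import chat, documents        # hypothetical router modules
# from core.errors import register_handlers  # hypothetical error-handler setup

app = FastAPI(title="AI Backend")

# app.include_router(chat.router, prefix="/chat", tags=["chat"])
# app.include_router(documents.router, prefix="/documents", tags=["documents"])
# register_handlers(app)

@app.get("/health")
def health() -> dict:
    # Health endpoint used by the quick-start check above
    return {"status": "ok"}
```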
✅ Refactoring Complete - Modular architecture implemented
🔧 In Progress - Functionality fixes for authentication and chat endpoints