A sample project demonstrating AI-powered agents, built with FastAPI and Dash.
- FastAPI backend exposing a `/chat` endpoint for conversational queries via LLM agents.
- Dash front-end UI for real-time chat with selectable AI models and agents.
- Research Agent: performs multi-step Google searches and fetches page content.
- Aula Agent: integrates with the Danish Aula school system to fetch profiles, messages, calendar events, etc.
- Configuration via environment variables (`.env`).
- Python 3.13+
- A valid Azure OpenAI key and endpoint (optional)
- Anthropic API credentials (optional)
- Aula login credentials (for `aula_agent`)
- `git`, plus `pip` or `poetry`
See `src/constants.py` for the list of available models.
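For orientation, `src/constants.py` might look roughly like this; the names and values below are illustrative assumptions, so check the actual file:

```python
# Illustrative shape of src/constants.py -- actual names and values may differ.
AVAILABLE_MODELS: list[str] = [
    "gpt-4o",             # Azure OpenAI
    "claude-3-5-sonnet",  # Anthropic
]

AVAILABLE_AGENTS: list[str] = [
    "research_agent",  # multi-step Google searches + page fetching
    "aula_agent",      # Danish Aula school system integration
]
```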
- Clone the repository:

  ```bash
  git clone git@github.com:A-Hoier/Aula-AI.d.git
  cd Aula-AI.d
  ```
- Install dependencies:

  ```bash
  uv sync
  ```
- Copy the example environment file and fill in your credentials:

  ```bash
  cp .env.example .env
  ```
- Edit `.env` and set all required variables (see Configuration below).
The application uses [pydantic-settings] to load variables from `.env`.
- Frontend: `BACKEND_URL` (e.g. `http://localhost:8000/`)
Run with Docker:

```bash
docker compose up
```
Or run the backend locally:

```bash
uv run uvicorn api:app --reload
```

- The API will be available at `http://localhost:8000`
- Swagger UI docs at `http://localhost:8000/docs`
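Once the backend is up, you can exercise the endpoint from Python. The JSON body shape below is an assumption about what `/chat` accepts:

```python
# Build a request against the running backend (assumed body shape).
import requests

req = requests.Request(
    "POST",
    "http://localhost:8000/chat",
    json={"message": "Hello", "model": "gpt-4o", "agent": "research_agent"},
).prepare()

# With the server running, send it with:
#   resp = requests.Session().send(req)
```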
In another terminal, start the frontend:

```bash
uv run python app.py
```

- Opens a local server (default `http://127.0.0.1:8050`) with the chat interface.
- Select an LLM model (e.g. `gpt-4o`) and an agent (`research_agent` or `aula_agent`).
- Type a message and hit Send or press Enter.