A chat interface for Ollama with persistent memory, backed by a ChromaDB vector database.

## Features

- Semantic search over conversation history
- Persistent memory across sessions
- Vector-based storage for better context retrieval
- Session management
- Error handling and graceful recovery
## Prerequisites

- Ollama installed and running
- Python 3.8+
- The mistral-nemo:latest model pulled in Ollama
## Setup

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```

- Make sure Ollama is running and the model is pulled:

  ```bash
  ollama pull mistral-nemo:latest
  ```

- Run the chat interface:

  ```bash
  python main.py
  ```
## Usage

Available commands:

- Type `quit` to exit
- Type `new` to start a new session
- Any other input is sent as a message to the chat
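The command handling above can be sketched as a simple input loop. This is an illustrative outline, not the actual code in `main.py`; the `send_message` and `new_session` callables stand in for the chat and session logic:

```python
# Minimal sketch of the CLI command loop. Function names are
# illustrative assumptions, not the real implementation in main.py.
def chat_loop(send_message, new_session):
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == "quit":
            break                        # exit the program
        elif user_input.lower() == "new":
            new_session()                # start a fresh session
            print("Started a new session.")
        elif user_input:
            # everything else is forwarded to the model
            print("Assistant:", send_message(user_input))
```
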
## How It Works

- Uses ChromaDB for vector-based storage of conversation history
- Semantically searches previous conversations for relevant context
- Maintains persistent memory across different chat sessions
- Automatically manages the conversation context window
## Project Structure

- `main.py` - Entry point and CLI interface
- `chat_interface.py` - Main chat logic and Ollama integration
- `vector_store.py` - ChromaDB vector database operations
- `requirements.txt` - Python dependencies
- `chroma_db/` - Directory where ChromaDB stores its data (created automatically)