English | 한국어
MCP Client Chatbot is a versatile chat interface that supports various AI model providers like OpenAI, Anthropic, Google, and Ollama. It is designed to run instantly in a 100% local environment without complex configuration, giving you full control over the computing resources on your own PC or server.
Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. Leverage the power of Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.
Our goal: Build an AI chatbot app that is optimized for personal use and easy for anyone to run.
Tool Integration Example: Demonstrates browser control using Microsoft's playwright-mcp.
Prompt Example: "Go to Reddit, open r/mcp, check the latest post and tell me what it's about, then close Reddit."
This project comes pre-configured with microsoft/playwright-mcp as a default MCP server. Try running the prompt above to see it in action!
Quick Tool Access: Use the `@` symbol in the message input to quickly select and call available MCP tools.
Standalone Tool Testing: Test MCP tools independently of the chat flow for easier development and debugging.
Model & Tool Selection UI: Easily switch LLM providers and view tool status directly within the prompt input panel.
- 100% Local Execution: Run directly on your PC or server without complex deployment, fully utilizing and controlling your computing resources.
- Multiple AI Model Support: Flexibly switch between providers like OpenAI, Anthropic, Google AI, and Ollama.
- Powerful MCP Integration: Seamlessly connect external tools (browser automation, database operations, etc.) to the chat via the Model Context Protocol.
- Standalone Tool Tester: Test and debug MCP tools separately from the main chat interface.
- Intuitive Mentions: Trigger available tools with `@` in the input field.
- Easy Server Setup: Configure MCP connections via the UI or the `.mcp-config.json` file.
- Markdown UI: Communicate in a clean, readable markdown-based interface.
- Zero-Setup Local DB: Uses SQLite by default for local storage (PostgreSQL also supported).
- Custom MCP Server Support: Modify the built-in MCP server logic or create your own.
This project uses pnpm as the recommended package manager.
```sh
# 1. Install dependencies
pnpm i

# 2. Initialize the project (creates .env, sets up the DB)
pnpm initial

# 3. Start the dev server
pnpm dev
```
Visit http://localhost:3000 after starting the server.
The `pnpm initial` command generates a `.env` file. Add your API keys there:

```
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
# ANTHROPIC_API_KEY=****
```
SQLite is the default database (`db.sqlite`). To use PostgreSQL instead, set `USE_FILE_SYSTEM_DB=false` and define `DATABASE_URL` in `.env`.
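For example, a PostgreSQL setup in `.env` could look like the sketch below. The connection string is only a placeholder; substitute your own host, credentials, and database name.

```
USE_FILE_SYSTEM_DB=false
# Placeholder connection string; replace with your own PostgreSQL instance
DATABASE_URL=postgres://user:password@localhost:5432/mcp_chatbot
```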
You can connect MCP tools via:
- UI Setup: Go to http://localhost:3000/mcp and configure through the interface.
- Direct File Edit: Modify `.mcp-config.json` in the project root (see the example below).
- Custom Logic: Edit `./custom-mcp-server/index.ts` to implement your own logic (a sketch follows this list).
- Supabase Integration: Use MCP to manage Supabase DB, auth, and real-time features.
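As a reference for the file-based setup, here is a minimal sketch of an `.mcp-config.json` entry for the bundled playwright-mcp server. The flat "server name to command/args" layout shown here follows the common MCP stdio-server convention and is an assumption; check the file generated in this repo for the authoritative schema.

```json
{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}
```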
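For the custom-server route, the sketch below shows what a small stdio MCP server built with the official `@modelcontextprotocol/sdk` could look like. The `echo` tool, its `message` argument, and the stdio transport are illustrative assumptions; the actual `./custom-mcp-server/index.ts` in this repo may register tools differently.

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical example server; adapt to the structure used in ./custom-mcp-server/index.ts
const server = new McpServer({ name: "custom-example", version: "0.1.0" });

// Register a simple "echo" tool that the chatbot can call from the chat input
server.tool(
  "echo",
  { message: z.string().describe("Text to echo back") },
  async ({ message }) => ({
    content: [{ type: "text", text: `Echo: ${message}` }],
  })
);

// Expose the server over stdio so an MCP client can spawn it and call its tools
const transport = new StdioServerTransport();
await server.connect(transport);
```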
We're making MCP Client Chatbot even more powerful with these planned features:
- Canvas Mode: Real-time editing interface for LLM + user collaboration (e.g. code, blogs).
- LLM UI Generation: Let LLMs render charts, tables, and forms dynamically.
- Rule Engine: Persistent system prompts/rules across the session.
- Image & File Uploads: Multimodal interaction via uploads and image generation.
- GitHub Mounting: Mount local GitHub repos to ask questions and work on code.
- RAG Agent: Retrieval-Augmented Generation using your own documents.
- Planning Agent: A smarter agent that plans and executes complex tasks.
- Agent Builder: A tool to create custom AI agents for specific goals.

See the full roadmap in `ROADMAP.md`.
We welcome all contributions! Bug reports, feature ideas, and code improvements all help us build the best local AI assistant.

Let's build it together.