English | 한국어
MCP Client Chatbot is a versatile chat interface that supports various AI model providers like OpenAI, Anthropic, Gemini, and Ollama.
It is also the first known speech-based chatbot with integrated MCP Server support, enabling real-time multimodal interactions.
Our mission is to build the most powerful tool-using chatbot, combining the best of language models and tool orchestration.
We aim to create diverse UX and features that let LLMs actively use tools, such as `@tool` mentions for direct invocation, speech-based chat that can access and use MCP server tools, quick tool presets for fast selection, and an upcoming workflow-with-tools feature for multi-step automation.
Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. Leverage the power of Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.
Open Source Project: MCP Client Chatbot is a 100% community-driven open source project.
Here are some quick examples of how you can use MCP Client Chatbot:
Example: Control a web browser using Microsoft's playwright-mcp tool.
Sample prompt:
Please go to GitHub and visit the cgoinglove profile.
Open the mcp-client-chatbot project.
Then, click on the README.md file.
After that, close the browser.
Finally, tell me how to install the package.
Quickly call any registered MCP tool during chat by typing `@toolname`. No need to memorize anything: just type `@` and pick from the list!
You can also control how tools are used with the new Tool Choice Mode:
- Auto: Tools are automatically called by the model when needed.
- Manual: The model will ask for your permission before calling any tool.
- None: Disables all tool usage.
Toggle modes anytime with the shortcut `⌘P`.
Add new MCP servers easily through the UI, and start using new tools without restarting the app.
Test MCP tools independently of chat sessions to simplify development and debugging.
Visualize chatbot responses as pie, bar, or line charts using the built-in tool, perfect for quick data insights during conversations.
- 100% Local Execution: Run directly on your PC or server without complex deployment, fully utilizing and controlling your computing resources.
- Multiple AI Model Support: Flexibly switch between providers like OpenAI, Anthropic, Google AI, and Ollama.
- Real-time Voice Chat powered by MCP Server: Currently supports the OpenAI provider (Gemini support coming soon).
- Powerful MCP Integration: Seamlessly connect external tools (browser automation, database operations, etc.) to chat via the Model Context Protocol.
- Standalone Tool Tester: Test and debug MCP tools separately from the main chat interface.
- Intuitive Mentions + Tool Control: Trigger tools with `@`, and control when they're used via Auto / Manual / None modes.
- Easy Server Setup: Configure MCP connections via the UI or a `.mcp-config.json` file.
- Markdown UI: Communicate in a clean, readable markdown-based interface.
- Custom MCP Server Support: Modify the built-in MCP server logic or create your own.
- Built-in Chart Tools: Generate pie, bar, and line charts directly in chat with natural prompts.
- Easy Deployment: Vercel support is baked in, making the chatbot easy to deploy and access.
- Run Anywhere: Launch easily with Docker Compose: just build the image and run.
- i18n Support: Korean and English translations make the chatbot accessible to more users.
This project uses pnpm as the recommended package manager.
```shell
# If you don't have pnpm:
npm install -g pnpm

# 1. Install dependencies
pnpm i

# 2. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. (Optional) If you already have PostgreSQL running and .env is configured, skip this step
pnpm docker:pg

# 4. Run database migrations
pnpm db:migrate

# 5. Start the development server
pnpm dev

# 6. (Optional) Build & start for local production-like testing
pnpm build:local && pnpm start
# Use build:local for local start to ensure correct cookie settings
```
```shell
# 1. Install dependencies
pnpm i

# 2. Create environment variable files and fill in the required values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. Build and start all services (including PostgreSQL) with Docker Compose
pnpm docker-compose:up
```
Open http://localhost:3000 in your browser to get started.
The `pnpm i` command generates a `.env` file. Add your API keys there.
```
# === LLM Provider API Keys ===
# You only need to enter the keys for the providers you plan to use
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
XAI_API_KEY=****
ANTHROPIC_API_KEY=****
OPENROUTER_API_KEY=****
OLLAMA_BASE_URL=http://localhost:11434/api

# Secret for Better Auth (generate with: npx @better-auth/cli@latest secret)
BETTER_AUTH_SECRET=****

# URL for Better Auth (the URL you access the app from)
BETTER_AUTH_URL=

# === Database ===
# If you don't have PostgreSQL running locally, start it with: pnpm docker:pg
POSTGRES_URL=postgres://your_username:your_password@localhost:5432/your_database_name

# Whether to use file-based MCP config (default: false)
FILE_BASED_MCP_CONFIG=false

# === OAuth Settings (Optional) ===
# Fill in these values only if you want to enable Google/GitHub login
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
```
You can connect MCP tools via:
- UI Setup: Go to http://localhost:3000/mcp and configure through the interface.
- Custom Logic: Edit `./custom-mcp-server/index.ts` to implement your own logic. Note that this does not run on Vercel or Docker.
- File-based Config (local dev only): Create a `.mcp-config.json` file and list your servers there. It works only in local development; no Docker or Vercel environment variables are required. For example:
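As an illustrative sketch only (the top-level shape of `.mcp-config.json` may differ from this guess, so verify against the `/mcp` UI or the project docs), a stdio-based entry for the playwright-mcp tool mentioned above might look like:

```json
{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}
```

Each key names a server as it will appear in chat mentions, and `command`/`args` describe how to launch it locally.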
Here are some practical tips and guides for using MCP Client Chatbot:
Learn how to set up Docker.
Learn how to set up Vercel.
Learn how to configure Google and GitHub OAuth for login functionality.
Learn how to integrate system instructions and structures with MCP servers to build an agent that assists with GitHub-based project management.
MCP Client Chatbot is evolving with these upcoming features:
- Self Hosting: ✅
- Easy deployment with Docker Compose ✅
- Vercel deployment support (MCP Server: SSE only) ✅
- File Attach & Image Generation:
- File upload and image generation
- Multimodal conversation support
- MCP Flow:
- Workflow automation with MCP Server integration
- Default Tools for Chatbot:
- Collaborative document editing (like OpenAI Canvas: user & assistant co-editing)
- RAG (Retrieval-Augmented Generation)
- Useful built-in tools for chatbot UX (usable without MCP)
- Seamless LLM-powered code writing, editing, and execution in a cloud development environment via Daytona integration. Instantly generate, modify, and run code with AI assistance, with no local setup required.
If you have suggestions or need specific features, please create an issue!
We welcome all contributions! Bug reports, feature ideas, code improvements: everything helps us build the best local AI assistant.
For detailed contribution guidelines, please see our Contributing Guide.
Language Translations: Help us make the chatbot accessible to more users by adding new language translations. See language.md for instructions on how to contribute translations.
Let's build it together!
Connect with the community, ask questions, and get support on our official Discord server!