MCP Client Chatbot

English | ν•œκ΅­μ–΄ (Korean)

Local First | MCP Supported | Discord

Deploy with Vercel

MCP Client Chatbot is a versatile chat interface that supports various AI model providers, including OpenAI, Anthropic, Google (Gemini), and Ollama.

It is also the first known speech-based chatbot with integrated MCP Server support, enabling real-time multimodal interactions.

Our mission is to build the most powerful tool-using chatbot, combining the best of language models and tool orchestration.

We aim to create diverse UX and features that let LLMs actively use tools: @tool mentions for direct invocation, speech-based chat that can access and use MCP server tools, quick tool presets for fast selection, and an upcoming workflow-with-tools feature for multi-step automation.

Built with the Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces and leverages the Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.
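To orient new contributors, the underlying pattern looks roughly like the sketch below: the Vercel AI SDK's experimental MCP client pulls tools from an MCP server and hands them to streamText. This is a minimal, generic illustration of the SDK pattern rather than this project's actual route code; the model, server URL, and function name are assumptions.

// sketch: wiring MCP tools into a Vercel AI SDK call (illustrative only)
import { streamText, experimental_createMCPClient as createMCPClient } from "ai";
import { openai } from "@ai-sdk/openai";

export async function answer(prompt: string): Promise<string> {
  // Connect to an MCP server over SSE (hypothetical URL).
  const mcpClient = await createMCPClient({
    transport: { type: "sse", url: "http://localhost:8808/sse" },
  });
  try {
    // Expose every tool the MCP server offers to the model.
    const tools = await mcpClient.tools();
    const result = streamText({
      model: openai("gpt-4o"), // any supported provider works here
      tools,
      prompt,
    });
    return await result.text;
  } finally {
    await mcpClient.close();
  }
}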

🌟 Open Source Project: MCP Client Chatbot is a 100% community-driven open source project.


Demo

Here are some quick examples of how you can use MCP Client Chatbot:

🧩 Browser Automation with Playwright MCP

[Demo GIF: playwright-demo]

Example: Control a web browser using Microsoft's playwright-mcp tool.

Sample prompt:

Please go to GitHub and visit the cgoinglove profile.
Open the mcp-client-chatbot project.
Then, click on the README.md file.
After that, close the browser.
Finally, tell me how to install the package.

⚑️ Quick Tool Mentions (@)

[Demo GIF: mention]

Quickly call any registered MCP tool during chat by typing @toolname.
No need to memorize anything: just type @ and pick from the list!

You can also control how tools are used with the new Tool Choice Mode:

  • Auto: Tools are automatically called by the model when needed.
  • Manual: The model will ask for your permission before calling any tool.
  • None: Disables all tool usage.

Toggle modes anytime with the shortcut ⌘P.
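Under the hood, the Auto and None modes roughly correspond to the toolChoice option that the Vercel AI SDK exposes on streamText; the Manual confirmation step is application-level behavior rather than an SDK flag. The sketch below only illustrates that SDK option, under the assumption that the selected mode is forwarded to the chat call; all names here are placeholders, not this project's code.

// sketch: mapping the UI tool-choice mode to the AI SDK's toolChoice option
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

type ToolMode = "auto" | "manual" | "none"; // the three UI modes

function toToolChoice(mode: ToolMode): "auto" | "none" {
  // "manual" still lets the model pick tools ("auto"); the confirmation
  // prompt before execution would be handled by the app, not the SDK.
  return mode === "none" ? "none" : "auto";
}

export async function chat(
  prompt: string,
  mode: ToolMode,
  tools: Parameters<typeof streamText>[0]["tools"],
) {
  const result = streamText({
    model: openai("gpt-4o"),
    tools,
    toolChoice: toToolChoice(mode),
    prompt,
  });
  return await result.text;
}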


πŸ”Œ Adding MCP Servers Easily

[Demo GIF: mcp-server-install]

Add new MCP servers easily through the UI, and start using new tools without restarting the app.


πŸ› οΈ Standalone Tool Testing

[Demo GIF: tool-test]

Test MCP tools independently of chat sessions to simplify development and debugging.

πŸ“Š Built-in Chart Tools

[Demo GIF: May-04-2025 01-55-04]

Visualize chatbot responses as pie, bar, or line charts using the built-in tool, perfect for quick data insight during conversations.
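For example, a prompt along these lines (purely illustrative) will produce a chart:

Compare the populations of Korea, Japan, and Germany and show the result as a bar chart.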


✨ Key Features

  • πŸ’» 100% Local Execution: Run directly on your PC or server without complex deployment, fully utilizing and controlling your computing resources.
  • πŸ€– Multiple AI Model Support: Flexibly switch between providers like OpenAI, Anthropic, Google AI, and Ollama.
  • πŸ—£οΈ Real-time voice chat powered by MCP Server: Currently supports OpenAI provider (Gemini support coming soon)
  • πŸ› οΈ Powerful MCP Integration: Seamlessly connect external tools (browser automation, database operations, etc.) into chat via Model Context Protocol.
  • πŸš€ Standalone Tool Tester: Test and debug MCP tools separately from the main chat interface.
  • πŸ’¬ Intuitive Mentions + Tool Control: Trigger tools with @, and control when they're used via Auto / Manual / None modes.
  • βš™οΈ Easy Server Setup: Configure MCP connections via UI or .mcp-config.json file.
  • πŸ“„ Markdown UI: Communicate in a clean, readable markdown-based interface.
  • 🧩 Custom MCP Server Support: Modify the built-in MCP server logic or create your own.
  • πŸ“Š Built-in Chart Tools: Generate pie, bar, and line charts directly in chat with natural prompts.
  • πŸ›« Easy Deployment: With Vercel support baked in, you get an easily accessible hosted chatbot.
  • πŸƒ Run Anywhere: Launch easily with Docker Compose: just build the image and run.
  • 🏴 i18n Support: Korean and English translations make the chatbot accessible to more users.

This project uses pnpm as the recommended package manager.

# If you don't have pnpm:
npm install -g pnpm

Quick Start (Local Version) πŸš€

# 1. Install dependencies
pnpm i

# 2. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. (Optional) If you already have PostgreSQL running and .env is configured, skip this step
pnpm docker:pg

# 4. Run database migrations
pnpm db:migrate

# 5. Start the development server
pnpm dev

# 6. (Optional) Build & start for local production-like testing
pnpm build:local && pnpm start
# Use build:local for local start to ensure correct cookie settings

Quick Start (Docker Compose Version) 🐳

# 1. Install dependencies
pnpm i

# 2. Create environment variable files and fill in the required values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. Build and start all services (including PostgreSQL) with Docker Compose
pnpm docker-compose:up

Open http://localhost:3000 in your browser to get started.


Environment Variables

The pnpm i command generates a .env file. Add your API keys there.

# === LLM Provider API Keys ===
# You only need to enter the keys for the providers you plan to use
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
XAI_API_KEY=****
ANTHROPIC_API_KEY=****
OPENROUTER_API_KEY=****
OLLAMA_BASE_URL=http://localhost:11434/api

# Secret for Better Auth (generate with: npx @better-auth/cli@latest secret)
BETTER_AUTH_SECRET=****

# URL for Better Auth (the URL you access the app from)
BETTER_AUTH_URL=
# === Database ===
# If you don't have PostgreSQL running locally, start it with: pnpm docker:pg
POSTGRES_URL=postgres://your_username:your_password@localhost:5432/your_database_name

# Whether to use file-based MCP config (default: false)
FILE_BASED_MCP_CONFIG=false

# === OAuth Settings (Optional) ===
# Fill in these values only if you want to enable Google/GitHub login
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=

MCP Server Setup

You can connect MCP tools via:

  1. UI Setup: Go to http://localhost:3000/mcp and configure through the interface.
  2. Custom Logic: Edit ./custom-mcp-server/index.ts to implement your own logic. Note that this option does not apply to the Vercel or Docker deployments. (A minimal sketch of a custom server follows the config example below.)
  3. File-Based Config (local dev only): Create a .mcp-config.json file and list your servers in it. This works only in local development; no Docker or Vercel environment variables are required. For example:
// .mcp-config.json
{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}
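If you take the custom-logic route, a minimal stdio MCP server in TypeScript looks roughly like the sketch below. It uses the official @modelcontextprotocol/sdk package and is a generic illustration, not the contents of this repository's ./custom-mcp-server/index.ts; the tool name and behavior are placeholders.

// sketch: a minimal custom MCP server exposing one tool (illustrative only)
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "custom-mcp-server", version: "1.0.0" });

// Register a simple tool; the chatbot can then call it like any other MCP tool.
server.tool(
  "echo",
  "Echo the given message back to the caller",
  { message: z.string().describe("Text to echo back") },
  async ({ message }) => ({
    content: [{ type: "text", text: `Echo: ${message}` }],
  })
);

// Communicate with the MCP client (the chatbot) over stdio.
const transport = new StdioServerTransport();
await server.connect(transport);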

πŸ’‘ Tips & Guides

Here are some practical tips and guides for using MCP Client Chatbot:

Learn how to set up Docker.

Learn how to set up Vercel.

Learn how to configure Google and GitHub OAuth for login functionality.

Learn how to integrate system instructions and structures with MCP servers to build an agent that assists with GitHub-based project management.

πŸ—ΊοΈ Roadmap: Next Features

MCP Client Chatbot is evolving with these upcoming features:

πŸš€ Deployment & Hosting βœ…

  • Self Hosting: βœ…
    • Easy deployment with Docker Compose βœ…
    • Vercel deployment support (MCP Server: SSE only) βœ…

πŸ“Ž File & Image

  • File Attach & Image Generation:
    • File upload and image generation
    • Multimodal conversation support

πŸ”„ MCP Workflow

  • MCP Flow:
    • Workflow automation with MCP Server integration

πŸ› οΈ Built-in Tools & UX

  • Default Tools for Chatbot:
    • Collaborative document editing (like OpenAI Canvas: user & assistant co-editing)
    • RAG (Retrieval-Augmented Generation)
    • Useful built-in tools for chatbot UX (usable without MCP)

πŸ’» LLM Code Write (with Daytona)

  • LLM-powered code writing and editing using Daytona integration
    • Seamless code writing, editing, and execution in a cloud development environment. Instantly generate, modify, and run code with AI assistance, with no local setup required.

πŸ’‘ If you have suggestions or need specific features, please create an issue!

πŸ™Œ Contributing

We welcome all contributions! Bug reports, feature ideas, code improvements: everything helps us build the best local AI assistant.

For detailed contribution guidelines, please see our Contributing Guide.

Language Translations: Help us make the chatbot accessible to more users by adding new language translations. See language.md for instructions on how to contribute translations.

Let's build it together πŸš€

πŸ’¬ Join Our Discord

Connect with the community, ask questions, and get support on our official Discord server!
