Use natural language to control your tools, apps, and services: connect once, command everything.
Global (npm)
npm install -g @truffle-ai/saiki
Build & Link from source
git clone https://github.com/truffle-ai/saiki.git
cd saiki
npm install
npm run build
npm link
After linking, the saiki command becomes available globally.
Invoke the interactive CLI:
saiki
Alternative: without global install
You can also run directly via npm:
npm start
Serve the experimental web interface:
saiki --mode web
Alternative: without global install
npm start -- --mode web
Open http://localhost:3000 in your browser.
Run Saiki as a Discord or Telegram bot.
Discord Bot:
saiki --mode discord
Make sure you have DISCORD_BOT_TOKEN set in your environment. See the docs for more details.
Telegram Bot:
saiki --mode telegram
Make sure you have TELEGRAM_BOT_TOKEN set in your environment. See the docs for more details.
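Both bot modes read their tokens from the environment. A minimal setup with placeholder values (substitute the real tokens from the Discord developer portal and Telegram's BotFather) might look like:

```shell
# Placeholder tokens -- replace with your real bot tokens before launching.
export DISCORD_BOT_TOKEN="your-discord-bot-token"
export TELEGRAM_BOT_TOKEN="your-telegram-bot-token"
```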
Saiki is an open, modular and extensible AI agent that lets you perform tasks across your tools, apps, and services using natural language. You describe what you want to do; Saiki figures out which tools to invoke and orchestrates them seamlessly, whether that means running a shell command, summarizing a webpage, or calling an API.
Why developers choose Saiki:
- Open & Extensible: Connect to any service via the Model Context Protocol (MCP).
- Config-Driven Agents: Define & save your agent prompts, tools (via MCP), and model in YAML.
- Multi-Interface Support: Use via CLI, wrap it in a web UI, or integrate into other systems.
- Runs Anywhere: Local-first runtime with logging, retries, and support for any LLM provider.
- Interoperable: Expose as an API or connect to other agents via MCP/A2A (soon).
Saiki is the missing natural language layer across your stack. Whether you're automating workflows, building agents, or prototyping new ideas, Saiki gives you the tools to move fast and bend them to your needs. Interact with Saiki via the command line or the new experimental web UI.
Ready to jump in? Follow the Installation guide or explore demos below.
Task: Can you go to amazon and add some snacks to my cart? I like trail mix, cheetos and maybe surprise me with something else?
# Use default config which supports puppeteer for navigating the browser
saiki
Spin up new agents out of the box and use them to power AI NPCs in your game environment. You can configure these agents to go beyond simple LLM responses and take real actions in-game.
Example project repo coming soon...
Task: Summarize emails and send highlights to Slack
saiki --config-file ./configuration/examples/email_slack.yml
saiki --config-file ./configuration/examples/notion.yml # Requires setup
The saiki command supports several options to customize its behavior. Run saiki --help for the full list.
> saiki -h
17:51:31 INFO: Log level set to: INFO
Usage: saiki [options] [prompt...]
AI-powered CLI and WebUI for interacting with MCP servers
Arguments:
prompt Optional headless prompt for single command mode
Options:
-c, --config-file <path> Path to config file (default: "configuration/saiki.yml")
-s, --strict Require all server connections to succeed
--no-verbose Disable verbose output
--mode <mode> Run mode: cli, web, discord, or telegram (default: "cli")
--web-port <port> Port for WebUI (default: "3000")
-m, --model <model> Specify the LLM model to use
-r, --router <router> Specify the LLM router to use (vercel or in-built)
-V, --version output the version number
Common Examples:
- Specify a custom configuration file:
  cp configuration/saiki.yml configuration/custom_config.yml
  saiki --config-file configuration/custom_config.yml
- Use a specific AI model (if configured):
  saiki -m gemini-2.5-pro-exp-03-25
Saiki defines agents using a YAML config file (configuration/saiki.yml by default). An agent configuration specifies which tool servers (MCP servers) to connect to and which LLM provider to use.
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - .
  puppeteer:
    type: stdio
    command: npx
    args:
      - -y
      - "@truffle-ai/puppeteer-server"
llm:
  provider: openai
  model: gpt-4.1-mini
  apiKey: $OPENAI_API_KEY
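The apiKey: $OPENAI_API_KEY line is expanded from your environment at startup, so export the key before launching the agent (placeholder value shown):

```shell
# Placeholder value -- substitute your real OpenAI API key.
export OPENAI_API_KEY="your-openai-api-key"
# Then start the agent with the config that references it, e.g.:
# saiki --config-file configuration/saiki.yml
```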
Saiki communicates with your tools via Model Context Protocol (MCP) servers. You can discover and connect to MCP servers in several ways:
- Browse pre-built servers:
  - Model Context Protocol reference servers: https://github.com/modelcontextprotocol/reference-servers
  - Smithery.ai catalog: https://smithery.ai/tools
  - Composio MCP registry: https://mcp.composio.dev/
- Search on npm:
  npm search @modelcontextprotocol/server
- Add servers to your configuration/saiki.yml under the mcpServers key (see the snippet above).
- Create custom servers:
  - Use the MCP TypeScript SDK: https://github.com/modelcontextprotocol/typescript-sdk
  - Follow the MCP spec: https://modelcontextprotocol.io/introduction
Saiki is designed to be a flexible component in your AI and automation workflows. Beyond the CLI and Web UI, you can integrate Saiki's core agent capabilities into your own applications or have it communicate with other AI agents.
When Saiki runs in web mode (saiki --mode web), it exposes a comprehensive REST API and a WebSocket interface, allowing you to control and interact with the agent programmatically. This is ideal for building custom front-ends, backend integrations, or embedding Saiki into existing platforms.
For detailed information on the available API endpoints and WebSocket communication protocol, please see the Saiki API and WebSocket Interface documentation.
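As a rough sketch of programmatic access, the request below posts a prompt to a hypothetical /api/message route on a locally running web-mode instance. The route name and payload shape are assumptions for illustration; check the API documentation for the real endpoints.

```shell
# Hypothetical route and payload, for illustration only -- see the API docs
# for the real endpoints. Assumes `saiki --mode web` is running locally.
response=$(curl -s -X POST http://localhost:3000/api/message \
  -H 'Content-Type: application/json' \
  -d '{"message": "summarize the README in this repo"}' \
  || echo '{"error": "server not reachable"}')
echo "$response"
```

If the server is not running, the fallback branch reports the failure instead of silently printing nothing.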
Saiki embraces the Model Context Protocol (MCP) not just for connecting to tools, but also for Agent-to-Agent communication. This means Saiki can:
- Act as an MCP Client: Connect to other AI agents that expose an MCP server interface, allowing Saiki to delegate tasks or query other agents as if they were any other tool.
- Act as an MCP Server: Saiki itself exposes an MCP server interface (see src/app/api/mcp_handler.ts and src/app/api/a2a.ts). This makes Saiki discoverable and usable by other MCP-compatible agents or systems. Another agent could connect to Saiki and utilize its configured tools and LLM capabilities.
This framework-agnostic approach allows Saiki to participate in a broader ecosystem of AI agents, regardless of their underlying implementation. By defining an AgentCard (a standardized metadata file, based on the A2A protocol, that describes an agent's capabilities and MCP endpoint), Saiki can be discovered by and interact with other agents seamlessly. (We also plan to integrate full support for the A2A protocol soon.)
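For a sense of what such metadata looks like, an AgentCard in the spirit of the A2A protocol is a small JSON document describing the agent and where to reach it. The exact fields Saiki uses may differ, so treat this as a hypothetical sketch rather than Saiki's actual schema:

```json
{
  "name": "saiki-agent",
  "description": "Natural-language agent that orchestrates MCP tools",
  "url": "http://localhost:3000",
  "skills": [
    { "id": "browse", "description": "Navigate web pages via the puppeteer server" }
  ]
}
```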
This powerful A2A capability opens up possibilities for creating sophisticated multi-agent systems where different specialized agents collaborate to achieve complex goals.
Find detailed guides, architecture notes, and the API reference in the docs/ folder.
We welcome contributions! See the contributing guide for more details.
Saiki was built by the team at Truffle AI.
Saiki is better with you! Join our Discord whether you want to say hello, share your projects, ask questions, or get help setting things up:
If you're enjoying Saiki, please give us a star on GitHub!
Elastic License 2.0. See LICENSE for details.
Thanks to all the amazing people who have contributed to Saiki! See the full list of contributors on GitHub.