A customizable, general purpose AI Agent that supports MCP. Talk to Saiki in natural language to control computers, applications and more!
Use natural language to control your tools, apps, and services — connect once, command everything.
Global (npm)
npm install -g @truffle-ai/saiki
From source
git clone https://github.com/truffle-ai/saiki.git
cd saiki
npm install
npm run build
npm link
After linking, the saiki command becomes available globally.
Invoke the interactive CLI:
saiki
You can also run directly via npm:
npm start
Serve the experimental web interface:
saiki --mode web
npm start -- --mode web
Open http://localhost:3000 in your browser.
Run Saiki as a server with just REST APIs and WebSockets:
saiki --mode server
This mode is ideal for programmatic integrations: the server exposes REST endpoints for messaging and MCP server management, plus WebSocket support for real-time communication.
Run Saiki as a Discord or Telegram bot.
Discord Bot:
saiki --mode discord
Make sure you have DISCORD_BOT_TOKEN set in your environment. See here for more details.
Telegram Bot:
saiki --mode telegram
Make sure you have TELEGRAM_BOT_TOKEN set in your environment. See here for more details.
Spin up an agent that acts as an MCP server:
saiki --mode mcp
Saiki is an open, modular and extensible AI agent that lets you perform tasks across your tools, apps, and services using natural language. You describe what you want to do — Saiki figures out which tools to invoke and orchestrates them seamlessly, whether that means running a shell command, summarizing a webpage, or calling an API.
Why developers choose Saiki:
Saiki is the missing natural language layer across your stack. Whether you're automating workflows, building agents, or prototyping new ideas, Saiki gives you the tools to move fast — and bend it to your needs. Interact with Saiki via the command line or the new experimental web UI.
Ready to jump in? Follow the Installation guide or explore demos below.
Task: Can you go to amazon and add some snacks to my cart? I like trail mix, cheetos and maybe surprise me with something else?
# Use default config which supports puppeteer for navigating the browser
saiki
Task: Summarize emails and send highlights to Slack
saiki --agent ./agents/examples/email_slack.yml
saiki --agent ./agents/examples/notion.yml # Requires setup
The saiki command supports several options to customize its behavior. Run saiki --help for the full list.
> saiki -h
Usage: saiki [options] [command] [prompt...]
Saiki CLI allows you to talk to Saiki, build custom AI Agents, build complex AI applications like Cursor, and more.
Run saiki interactive CLI with `saiki` or run a one-shot prompt with `saiki <prompt>`
Run saiki web UI with `saiki --mode web`
Run saiki as a server (REST APIs + WebSockets) with `saiki --mode server`
Run saiki as a discord bot with `saiki --mode discord`
Run saiki as a telegram bot with `saiki --mode telegram`
Run saiki as an MCP server with `saiki --mode mcp`
Check subcommands for more features. Check https://github.com/truffle-ai/saiki for documentation on how to customize saiki and other examples
Arguments:
prompt Natural-language prompt to run once. If not passed, saiki will start as an interactive CLI
Options:
-v, --version output the current version
-a, --agent <path> Path to agent config file (default: "agents/agent.yml")
-s, --strict Require all server connections to succeed
--no-verbose Disable verbose output
-m, --model <model> Specify the LLM model to use.
-r, --router <router> Specify the LLM router to use (vercel or in-built)
--mode <mode> The application in which saiki should talk to you - cli | web | server | discord | telegram | mcp (default: "cli")
--web-port <port> optional port for the web UI (default: "3000")
-h, --help display help for command
Commands:
create-app Scaffold a new Saiki Typescript app
init-app Initialize an existing Typescript app with Saiki
Common Examples:
Specify a custom agent:
cp agents/agent.yml agents/custom_config.yml
saiki --agent agents/custom_config.yml
Use a specific AI model (if configured):
saiki -m gemini-2.5-pro-exp-03-25
Saiki defines agents using a YAML config file (agents/agent.yml by default). An agent configuration specifies which tool servers (MCP servers) to connect and which LLM provider to use.
mcpServers:
filesystem:
type: stdio
command: npx
args:
- -y
- "@modelcontextprotocol/server-filesystem"
- .
puppeteer:
type: stdio
command: npx
args:
- -y
- "@truffle-ai/puppeteer-server"
llm:
provider: openai
model: gpt-4.1-mini
apiKey: $OPENAI_API_KEY
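If you prefer to assemble a configuration programmatically before writing it out, the YAML above maps onto a plain object. A minimal sketch, where the interface names (StdioServerConfig, AgentConfig) are our own illustrations of the schema shown, not Saiki's published types:

```typescript
// Hypothetical types mirroring the YAML sample above; field names come
// from that sample, not from Saiki's exported type definitions.
interface StdioServerConfig {
  type: 'stdio';
  command: string;
  args: string[];
  env?: Record<string, string>;
}

interface AgentConfig {
  mcpServers: Record<string, StdioServerConfig>;
  llm: { provider: string; model: string; apiKey: string };
}

// The same agent as the YAML example: a filesystem MCP server plus OpenAI.
const config: AgentConfig = {
  mcpServers: {
    filesystem: {
      type: 'stdio',
      command: 'npx',
      args: ['-y', '@modelcontextprotocol/server-filesystem', '.'],
    },
  },
  llm: { provider: 'openai', model: 'gpt-4.1-mini', apiKey: '$OPENAI_API_KEY' },
};
```

Such an object could then be serialized to YAML; the YAML file remains the documented entry point.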
Saiki supports multiple LLM providers out of the box, plus any OpenAI SDK-compatible provider.
OpenAI: gpt-4.1-mini, gpt-4o, o3, o1, and more
Anthropic: claude-3-7-sonnet-20250219, claude-3-5-sonnet-20240620, and more
Google: gemini-2.5-pro-exp-03-25, gemini-2.0-flash, and more
Groq: llama-3.3-70b-versatile, gemma-2-9b-it
You will need to set the corresponding provider-specific API keys.
Set your API key and run:
# OpenAI (default)
export OPENAI_API_KEY=your_key
saiki
# Switch providers via CLI
saiki -m claude-3-5-sonnet-20240620
saiki -m gemini-2.0-flash
For comprehensive setup instructions, all supported models, advanced configuration, and troubleshooting, see our LLM Providers Guide.
Saiki can be easily integrated into your applications as a powerful AI agent library. Here's a simple example to get you started:
import 'dotenv/config';
import { loadConfigFile, SaikiAgent } from '@truffle-ai/saiki';
// Load your agent configuration
const config = await loadConfigFile('./agent.yml');
const agent = new SaikiAgent(config);
// Start the agent (initialize async services)
await agent.start();
// Use the agent for single tasks
const result = await agent.run("Analyze the files in this directory and create a summary");
console.log(result);
// Clean shutdown when done
await agent.stop();
// Or have conversations
const response1 = await agent.run("What files are in the current directory?");
const response2 = await agent.run("Create a README for the main.py file");
// Reset conversation when needed
agent.resetConversation();
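Because start() and stop() bracket every session, it can help to wrap them in a small lifecycle helper so shutdown runs even when a task throws. A sketch built against only the calls shown above; AgentLike is our own minimal type for illustration, not a Saiki export:

```typescript
// Minimal slice of the agent surface used in the example above.
interface AgentLike {
  start(): Promise<void>;
  run(task: string): Promise<string>;
  stop(): Promise<void>;
}

// Runs tasks in order and guarantees stop() is called, even on failure.
async function withAgent(agent: AgentLike, tasks: string[]): Promise<string[]> {
  await agent.start();
  try {
    const results: string[] = [];
    for (const task of tasks) {
      results.push(await agent.run(task));
    }
    return results;
  } finally {
    await agent.stop();
  }
}
```

With this pattern, a real SaikiAgent instance would satisfy AgentLike structurally, so the helper needs no changes when you swap in the real class.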
For detailed information on the available API endpoints and WebSocket communication protocol, please see the Saiki API and WebSocket Interface documentation.
For comprehensive guides on building different types of applications with Saiki, see our Building with Saiki Developer Guide.
Saiki includes a powerful MCPManager that can be used as a standalone utility for managing MCP servers in your own applications. This is perfect for developers who need MCP server management without the full Saiki agent framework.
import { MCPManager } from '@truffle-ai/saiki';
// Create manager instance
const manager = new MCPManager();
// Connect to MCP servers
await manager.connectServer('filesystem', {
type: 'stdio',
command: 'npx',
args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
});
await manager.connectServer('web-search', {
type: 'stdio',
command: 'npx',
args: ['-y', '[email protected]'],
env: { TAVILY_API_KEY: process.env.TAVILY_API_KEY }
});
// Get all available tools across servers
const tools = await manager.getAllTools();
console.log('Available tools:', Object.keys(tools));
// Execute a tool
const result = await manager.executeTool('readFile', { path: './README.md' });
console.log('File contents:', result);
// List connected servers
const clients = manager.getClients();
console.log('Connected servers:', Array.from(clients.keys()));
// Disconnect when done
await manager.disconnectAll();
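The same connect/teardown discipline applies here: disconnectAll() should run even if a connection or tool call fails partway through. A hedged sketch using only the MCPManager calls shown above; ManagerLike is our own minimal type for illustration, not a Saiki export:

```typescript
// Minimal slice of the manager surface used in the example above.
interface ManagerLike {
  connectServer(name: string, config: unknown): Promise<void>;
  disconnectAll(): Promise<void>;
}

// Connects each named server, runs the callback, and always disconnects,
// whether the callback succeeds or throws.
async function withServers<T>(
  manager: ManagerLike,
  servers: Record<string, unknown>,
  fn: (m: ManagerLike) => Promise<T>,
): Promise<T> {
  try {
    for (const [name, config] of Object.entries(servers)) {
      await manager.connectServer(name, config);
    }
    return await fn(manager);
  } finally {
    await manager.disconnectAll();
  }
}
```

A real MCPManager instance would satisfy ManagerLike structurally, so the helper can wrap the exact connection configs from the example above.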
The MCPManager provides a simple, unified interface for connecting to and managing multiple MCP servers simultaneously. See our MCP Manager Documentation for complete API reference and advanced usage patterns.
Find detailed guides, architecture, and API reference in our comprehensive documentation.
We welcome contributions! Refer to our Contributing Guide for more details.
Saiki was built by the team at Truffle AI.
Saiki is better with you! Join our Discord whether you want to say hello, share your projects, ask questions, or get help setting things up.
If you're enjoying Saiki, please give us a ⭐ on GitHub!
Elastic License 2.0. See LICENSE for details.
Thanks to all these amazing people for contributing to Saiki! (full list):
{
  "mcpServers": {
    "saiki": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}