VibeOps - Cisco pyATS MCP Plus Many Other MCPs
MCPyATS brings together the Model Context Protocol (MCP) and Cisco pyATS to build intelligent, distributed agents capable of automating network tasks via natural language.
It includes multiple tool-integrated MCP servers, a LangGraph agent backend, a Streamlit UI frontend, and support for A2A (Agent-to-Agent) communication.
The Model Context Protocol (MCP) from Anthropic offers USB-C-like interoperability between MCP servers, which expose tools, and MCP clients, where the agent runs. It standardizes communication between agents and tools so they can work together seamlessly.
The Agent-to-Agent (A2A) protocol from Google is a standardized method for agents to communicate and collaborate with each other. It allows agents to share tasks, delegate responsibilities, and exchange information in a structured manner.
LangGraph is a framework for building and deploying AI agents that can reason, plan, and execute tasks using natural language. It provides a set of tools and libraries for creating agents that can interact with various services and APIs, making it easier to build complex automation workflows.
Cisco pyATS (Python Automated Test Systems) is a framework for automating network testing and validation. It provides a set of tools and libraries for creating and executing tests on network devices, making it easier to validate configurations, performance, and functionality.
Streamlit is an open-source app framework for Machine Learning and Data Science projects. It allows you to create beautiful web applications for your data science projects with minimal effort. You can use Streamlit to build interactive dashboards, visualizations, and user interfaces for your AI agents.
MCPyATS is designed to be modular and extensible, allowing you to easily add new tools and services as needed.
MCPyATS is built on top of Docker, making it easy to deploy and run in any environment. It includes a set of pre-built Docker images for all the MCP servers and the LangGraph agent, making it easy to get started with minimal setup.
Out of the box the LLM is OpenAI's GPT-4o, but you can switch to Gemini or any other LLM compatible with LangGraph: find the llm = assignment in the mcpyats.py file and change it to the model you want to use.
The testbed is ready to use with the Cisco DevNet Sandbox for CML; reserve the sandbox to get started with the testbed.
You can use your own topology by editing the testbed.yaml file in the mcpyats directory. The testbed.yaml file generated by the agent is saved in the shared_output directory. If you have your own CML (v2) instance, you can use its REST API to generate the testbed file.
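For reference, a pyATS testbed for a single IOS XE device follows this shape; the device name, address, and credentials below are placeholders, not the sandbox values:

```yaml
devices:
  router-1:                 # hypothetical device name
    os: iosxe
    type: router
    credentials:
      default:
        username: admin
        password: changeme
    connections:
      cli:
        protocol: ssh
        ip: 10.10.20.48     # placeholder management IP
        port: 22
```

The agent needs PYATS_TESTBED_PATH (see the .env variables below) to point at this file.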
The repository is organized into several components, each serving a specific purpose in the MCPyATS ecosystem. Below is a high-level overview of the structure:
📁 Repository Structure

MCPyATS/
├── a2a/               # Agent-to-Agent Adapter implementation
├── drawio/            # Custom local Draw.io instance with MCP integration
├── mcp_servers/       # All MCP tool servers (Slack, GitHub, ServiceNow, etc.)
├── mcpyats/           # LangGraph-based MCP agent using pyATS and tools
├── streamlit/         # Streamlit web frontend on port :8501
├── shared_output/     # Volume-mounted folder for shared artifact storage
├── .env.example       # Sample environment file (copy to .env and update)
└── docker_startup.sh  # Startup script to launch everything
🛠️ Quick Start

Clone the repo:
git clone https://github.com/automateyournetwork/MCPyATS
cd MCPyATS
Prepare your environment
Create your .env file based on the sample provided:
cp .env.example .env
Update all the required API keys, tokens, and paths inside .env.
Launch the system
./docker_startup.sh
This will bring up all of the components described below.
📦 Component Breakdown
🔁 a2a/ → Agent-to-Agent Adapter
🧩 mcp_servers/ → Tool Servers
🎨 drawio/ → Local Draw.io Instance
🧠 mcpyats/ → MCPyATS Agent
🗂️ shared_output/ → Shared Volume
🌐 streamlit/ → Frontend UI
🔐 Environment Variables
📄 .env Variables
🧠 Core Agent Configuration
PYATS_TESTBED_PATH= # Path to your pyATS testbed YAML file
🤖 LLM & LangSmith
OPENAI_API_KEY= # OpenAI API key (GPT-4o by default)
GOOGLE_API_KEY= # Google API Key for Gemini LLM
LANGSMITH_TRACING= # LangSmith tracing enabled (true/false)
LANGSMITH_API_KEY= # LangSmith API key
LANGSMITH_ENDPOINT= # LangSmith endpoint
LANGSMITH_PROJECT= # LangSmith project name
🌍 Enrichment APIs
WEATHER_API_KEY= # OpenWeatherMap API Key
ABUSEIPDB_API_KEY= # AbuseIPDB API Key
🧰 GitHub MCP
GITHUB_USERNAME= # GitHub username
GITHUB_TOKEN= # GitHub token
🗺️ Google Maps MCP
GOOGLE_MAPS_API_KEY= # Google Maps API key
💬 Slack MCP
SLACK_BOT_TOKEN= # Slack bot token
SLACK_TEAM_ID= # Slack team ID
SLACK_CHANNEL_ID= # Slack channel ID
🔍 Selector MCP
SELECTOR_AI_API_KEY= # Selector AI API key
SELECTOR_URL= # Selector API URL
💾 FileSystem MCP
FILESYSTEM_PATH= # Local path mounted into the FileSystem
🔧 NetBox MCP
NETBOX_URL= # NetBox API URL
NETBOX_TOKEN= # NetBox API token
🛠️ ServiceNow MCP
SERVICENOW_URL= # ServiceNow instance URL
SERVICENOW_USERNAME= # ServiceNow username
SERVICENOW_PASSWORD= # ServiceNow password
✉️ Email MCP
EMAIL_HOST= # SMTP server host (e.g., smtp.gmail.com)
EMAIL_PORT= # SMTP server port (e.g., 587 for TLS)
EMAIL_SSL= # Enable SSL (true/false)
EMAIL_ACCOUNT= # Email address to send from
EMAIL_PASSWORD= # Password for the email account
🛡️ ISE MCP
ISE_BASE=https://devnetsandboxise.cisco.com
USERNAME=readonly
PASSWORD=ISEisC00L
🛡️ NIST MCP
NVD_API_KEY= # NIST NVD API key
🔁 A2A Adapter
AGENT_CARD_OUTPUT_DIR= # Path to save .well-known/agent.json
A2A_PEER_AGENTS= # Comma-separated list of peer agent URLs
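Docker injects these values via --env-file, but if you want to inspect them programmatically outside the containers, a minimal stdlib .env loader is enough. This sketch is an assumption for illustration, not code from the repo:

```python
import os
import tempfile

def load_env(path=".env"):
    """Parse KEY=VALUE lines from a .env file, skipping comments and blanks."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"').strip("'")
    return values

# Demo: write a tiny .env to a temp dir and read it back
path = os.path.join(tempfile.mkdtemp(), "demo.env")
with open(path, "w") as f:
    f.write("# demo\nPYATS_TESTBED_PATH=/app/testbed.yaml\nEMAIL_PORT=587\n")

env = load_env(path)
print(env["PYATS_TESTBED_PATH"])  # /app/testbed.yaml
```

In practice a library such as python-dotenv does the same job with more edge cases handled.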
🧠 LangGraph Agent (mcpyats/)
🧠 What It Does
📂 Key Files
🧱 Architecture
START
  ↓
[select_tools] ← Vector + LLM-assisted tool filtering
  ↓
[assistant] ← LLM bound to selected tools and context
  ↓
[tools] ← Executes structured tool calls via MCP or delegation
  ↓
[handle_tool_results] ← Updates state, context, summaries
  ↓
[assistant] ← Final reply or new tool request
  ↓
END, or back to [tools]
⚙️ Tool Discovery Logic

```python
tool_services = [
    ("pyats-mcp", ["python3", "pyats_mcp_server.py", "--oneshot"], "tools/discover", "tools/call"),
    ("drawio-mcp", "http://host.docker.internal:11434/rpc", "tools/list", "tools/call"),
    ...
]
```
Supported transports: a command list (spawned as a STDIO subprocess) or an HTTP JSON-RPC endpoint URL.
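The agent can branch on the shape of each entry to pick a transport; a minimal sketch (the transport_for helper is hypothetical, not a function from the repo):

```python
def transport_for(service):
    """Infer the transport for a tool_services entry:
    a command list means STDIO, a URL string means HTTP JSON-RPC."""
    name, target, discover_method, call_method = service
    if isinstance(target, list):
        return "stdio"
    if isinstance(target, str) and target.startswith("http"):
        return "http"
    raise ValueError(f"unknown transport for {name}")

tool_services = [
    ("pyats-mcp", ["python3", "pyats_mcp_server.py", "--oneshot"], "tools/discover", "tools/call"),
    ("drawio-mcp", "http://host.docker.internal:11434/rpc", "tools/list", "tools/call"),
]

print([transport_for(s) for s in tool_services])  # ['stdio', 'http']
```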
🧬 A2A Agent Card
```json
{
  "name": "pyATS Agent",
  "description": "Cisco pyATS Agent with access to many MCP tools",
  "url": "https://your-agent-url",
  "methods": {
    "send": "https://your-agent-url/"
  },
  "capabilities": {
    "a2a": true,
    "tool-use": true,
    "chat": true,
    "push-notifications": true
  },
  "skills": [
    {
      "id": "pyATS_run_show_command",
      "description": "Runs a show command on a device"
    },
    ...
  ]
}
```
This allows other A2A agents to discover and delegate tasks to your MCPyATS instance.
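A peer that fetches this card only needs to parse it and sanity-check the fields it relies on; a minimal sketch, where validate_agent_card and the required-key set are assumptions rather than code from the repo:

```python
import json

REQUIRED_KEYS = {"name", "description", "url", "methods", "capabilities", "skills"}

def validate_agent_card(raw: str) -> dict:
    """Parse an agent card and check the fields peers rely on for discovery."""
    card = json.loads(raw)
    missing = REQUIRED_KEYS - card.keys()
    if missing:
        raise ValueError(f"agent card missing keys: {sorted(missing)}")
    return card

# Demo with a card shaped like the example above
card = validate_agent_card(json.dumps({
    "name": "pyATS Agent",
    "description": "Cisco pyATS Agent with access to many MCP tools",
    "url": "https://example.invalid",
    "methods": {"send": "https://example.invalid/"},
    "capabilities": {"a2a": True, "tool-use": True, "chat": True, "push-notifications": True},
    "skills": [{"id": "pyATS_run_show_command", "description": "Runs a show command on a device"}],
}))
print(card["skills"][0]["id"])  # pyATS_run_show_command
```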
🧪 Local Development
cd mcpyats/
langgraph dev --host 0.0.0.0 --port 2024
LangGraph will use the langgraph.json file:
```json
{
  "entrypoints": {
    "MCpyATS": "mcpyats:compiled_graph"
  },
  ...
}
```
🧠 LangGraph Agent: How It Works
🧩 Agent Flow Diagram (from Studio UI)
Here's the LangGraph pipeline (langgraph_studio.png), annotated:
START
  ↓
select_tools ← Vector search + LLM filtering
  ↓
assistant ← LLM + tool binding (uses selected tools)
  ↓      ↑
tools → handle_tool_results (loops back to assistant)
  ↓
END ← if no tool call is required
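Stripped of LangGraph itself, the loop above can be sketched with plain functions standing in for the LLM and the MCP tools; every name here is a stub for illustration, not code from mcpyats.py:

```python
def run_agent(messages, assistant, tools):
    """Toy version of the loop above: call the assistant; if it requests a
    tool, execute it, feed the result back, and repeat until a final reply."""
    while True:
        reply = assistant(messages)                            # assistant node
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]                            # END
        result = tools[call["name"]](**call["args"])           # tools node
        messages = messages + [{"role": "tool", "content": result}]  # handle_tool_results

# Stub assistant: asks for one tool call, then answers from the result.
def stub_assistant(messages):
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "show_version", "args": {"device": "R1"}}}
    return {"content": f"Device said: {messages[-1]['content']}", "tool_call": None}

answer = run_agent(
    [{"role": "user", "content": "What version is R1 running?"}],
    stub_assistant,
    {"show_version": lambda device: f"{device} runs IOS XE 17.9"},
)
print(answer)  # Device said: R1 runs IOS XE 17.9
```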
📝 Breakdown of Each Node
🌐 Hosted Interface Links (update IP for local use)
📚 Summary of Technologies
The mcp_servers/ directory houses all modular MCP-compliant tool servers. Each folder represents an individual service that exposes structured tools over STDIO or HTTP, and can be independently containerized.
📦 Included MCP Servers:
chatgpt
→ Interface with OpenAI models via structured tool calls
→ ./mcp_servers/chatgpt/README.md
drawio_mcp
→ Communicate with a browser-based Draw.io instance using WebSockets or HTTP
→ ./mcp_servers/drawio_mcp/README.md
email
→ Send email using SMTP
→ ./mcp_servers/email/README.md
excalidraw
→ Generate freeform drawings and diagrams
→ ./mcp_servers/excalidraw/README.md
filesystem
→ Read, write, edit, and manage files on the host system
→ ./mcp_servers/filesystem/README.md
github
→ Perform GitHub repo actions like push, commit, or create issues
→ ./mcp_servers/github/README.md
google_maps
→ Geocoding and location data tools
→ ./mcp_servers/google_maps/README.md
google_search
→ Perform Google search queries
→ ./mcp_servers/google_search/README.md
mermaid
→ Generate diagrams from Mermaid syntax
→ ./mcp_servers/mermaid/README.md
vegalite
→ Create charts and plots from VegaLite specs
→ ./mcp_servers/vegalite/README.md
quickchart
→ Generate standard charts via QuickChart
→ ./mcp_servers/quickchart/README.md
netbox
→ Query and manipulate NetBox resources
→ ./mcp_servers/netbox/README.md
nist
→ Look up CVEs and vulnerabilities from the NIST NVD
→ ./mcp_servers/nist/README.md
pyats_mcp_server
→ Run pyATS commands, parse configs, test connectivity
→ ./mcp_servers/pyats_mcp_server/README.md
rfc
→ Search and retrieve IETF RFC documents
→ ./mcp_servers/rfc/README.md
sequentialthinking
→ Logical task planning and chaining
→ ./mcp_servers/sequentialthinking/README.md
servicenow
→ Create and manage ServiceNow tickets
→ ./mcp_servers/servicenow/README.md
slack
→ Post messages or react to threads in Slack
→ ./mcp_servers/slack/README.md
subnet-calculator
→ Calculate subnets from a CIDR notation IP
→ ./mcp_servers/subnet-calculator/README.md
ise
→ Cisco Identity Services Engine (ISE) integration for network access control
→ ./mcp_servers/ise/README.md
Each server folder includes a Dockerfile and a server.py or index.ts (depending on language/runtime) that defines the available tools and communication logic.
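For a feel of what a oneshot STDIO server looks like, here is a minimal read-JSON, answer-JSON sketch. The method names mirror the tool discovery table earlier, but the tool itself and all helper names are invented, not taken from any server in the repo:

```python
import json
import sys

# One toy tool: host count for an IPv4 prefix (hypothetical, for illustration)
TOOLS = {
    "subnet_info": lambda cidr: {"cidr": cidr, "hosts": 2 ** (32 - int(cidr.split("/")[1])) - 2},
}

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC-style request to a registered tool."""
    if request["method"] == "tools/discover":
        return {"result": sorted(TOOLS)}
    if request["method"] == "tools/call":
        name = request["params"]["name"]
        args = request["params"].get("arguments", {})
        return {"result": TOOLS[name](**args)}
    return {"error": f"unknown method {request['method']}"}

if __name__ == "__main__" and "--oneshot" in sys.argv:
    # Oneshot mode: one request in on stdin, one response out on stdout, then exit.
    print(json.dumps(handle(json.loads(sys.stdin.readline()))))
```

The client (the LangGraph agent) spawns the process, writes a request line, and reads the reply, which is exactly what the --oneshot entries in tool_services imply.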
This component exposes your LangGraph-based MCP agent to the public web using the Agent-to-Agent (A2A) protocol. It enables cross-agent collaboration by accepting and delegating structured tasks via standard JSON-RPC.
It also supports push notifications, Slack integration, and public discovery using .well-known/agent.json.
The adapter exposes:
- a send method for receiving tasks from peer agents
- .well-known/agent.json for agent discovery

a2a/
├── a2a_adapter.py    # Main FastAPI application (the adapter)
├── Dockerfile        # Container for exposing the A2A adapter
├── requirements.txt  # Required Python packages
└── .well-known/
    └── agent.json    # This agent's public card for peer discovery

Incoming tasks are forwarded to the LangGraph agent (listening on :2024). Your .well-known/agent.json file exposes your agent's identity, capabilities, and tools:
```json
{
  "name": "pyATS Agent",
  "description": "Cisco pyATS Agent with access to many MCP tools",
  "url": "https://your-ngrok-or-public-url",
  "methods": {
    "send": "https://your-ngrok-or-public-url/"
  },
  "capabilities": {
    "a2a": true,
    "tool-use": true,
    "chat": true,
    "push-notifications": true
  },
  "skills": [
    {
      "id": "pyATS_run_show_command",
      "description": "Runs a show command on a device"
    }
    ...
  ]
}
```
✨ Features
✅ Peer Discovery
🧠 Delegation
📁 Artifact Hosting
📣 Push Notifications
🔐 Security Notes
🧱 Based On
✅ Next Steps
🚀 docker_startup.sh → One-Command Launcher for MCPyATS
The script builds and launches:
- the MCP tool containers under mcp_servers/
- the mcpyats agent

For each directory in ./mcp_servers/, this script:
- runs docker build
- tags the image with the server name (e.g., github-mcp, servicenow-mcp)

Includes servers for:
- mcpyats → the LangGraph agent
- a2a-adapter → the A2A FastAPI adapter
- drawio and drawio-mcp → for visual topology editing
- streamlit-app → web UI on http://localhost:8501

Along the way it:
- reads .env for tokens and keys
- mounts the shared volume (shared_output/)
- passes --env-file .env where applicable
- exposes the required ports (10000, 2024, 8501, 8080, 3000, 11434)

Ensure you have a valid .env file configured (copy from .env.example):
cp .env.example .env
nano .env # fill in tokens and secrets
Then run:
./docker_startup.sh
🌐 Streamlit UI → Frontend for MCpyATS
This directory provides a lightweight Streamlit-based web interface for interacting with the MCpyATS LangGraph AI agent.
It enables natural language chat with your network and connected MCP tools (via HTTP), and provides a visual entry point for interacting with all your infrastructure agents.
The frontend:
- connects to the LangGraph agent on :2024 via the /threads/.../runs/stream API
- works with tools such as pyats_mcp_server, drawio, github, filesystem, etc.
- displays logo.jpeg at the top of the app
streamlit/
├── Dockerfile     # Builds and runs Streamlit app
├── streamlit.py   # Main Streamlit frontend logic
└── logo.jpeg      # Logo displayed at the top of the app
🐳 Docker Usage
You can build and run the Streamlit app using the included Dockerfile:
```bash
docker build -t streamlit-app ./streamlit
docker run -d --name streamlit-app -p 8501:8501 streamlit-app
```

Then open:
http://localhost:8501
🧪 Local Dev Usage
If you prefer to run locally with Python:
pip install streamlit requests
cd streamlit/
streamlit run streamlit.py
⚙️ Configuration
The frontend expects your LangGraph agent to be running at:
http://host.docker.internal:2024
This is set in the code via:
API_BASE_URL = "http://host.docker.internal:2024"
You can change this to http://localhost:2024 or a public agent URL if needed.
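If you'd rather not edit the source, one option is to read the base URL from an environment variable with the current value as the fallback. This pattern is a suggestion, not how streamlit.py is written today, and MCPYATS_API_BASE_URL is a made-up variable name:

```python
import os

# Fall back to the in-container default when no override is set.
API_BASE_URL = os.environ.get("MCPYATS_API_BASE_URL", "http://host.docker.internal:2024")
print(API_BASE_URL)  # http://host.docker.internal:2024 unless overridden
```

You could then point the UI at a local or public agent with, e.g., `docker run -e MCPYATS_API_BASE_URL=http://localhost:2024 ... streamlit-app`.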
💬 Prompting Guide
Some examples you can use in the prompt box:
🧠 State Management
📦 Dependencies
Already included in the Dockerfile:
RUN pip install requests
RUN pip install streamlit
📝 Development Notes
- Runs on port 8501
- Accepts prompt input via st.chat_input
- Supports both raw text replies and structured tool messages
- Gracefully handles tool call responses, JSON decoding, and missing fields
- Can be extended to support file uploads and session tagging

🌐 Access
Once running, access the UI via:
http://localhost:8501
If deployed remotely, update your .env and reverse proxy as needed.
Once you run the ./docker_startup.sh script successfully, MCPyATS will:
✅ Build and start 20+ Docker containers
✅ Expose local and web-accessible endpoints
✅ Enable real-time interaction across AI, automation, visualization, and collaboration layers
Below is a complete list of the key containers, their roles, and exposed ports:
| Service | Container Name | Description | Port(s) |
|---|---|---|---|
| LangGraph Agent | mcpyats | Core LangGraph-based AI orchestrator | :2024 |
| Streamlit UI | streamlit-app | Web frontend for natural language interface | :8501 |
| Draw.io (MCP) | drawio-mcp | MCP server with WebSocket + TCP tool bridge | :3000, :11434 |
| Draw.io (Local) | drawio-local | Browser-accessible standalone Draw.io instance | :8080 |
| A2A Adapter | a2a-adapter | Agent-to-Agent communication adapter | :10000 |
The following MCP containers are launched in the background and made available to the LangGraph agent:
| MCP Server | Container Name | Description |
|---|---|---|
| Filesystem | filesystem-mcp | Read/write local files |
| NetBox | netbox-mcp | Query network source-of-truth |
| Google Search | google-search-mcp | Web search interface |
| ServiceNow | servicenow-mcp | Create/update problem tickets |
| Email (SMTP) | email-mcp | Send outbound emails |
| ChatGPT | chatgpt-mcp | Interface with OpenAI GPT models |
| pyATS | pyats-mcp | Execute network commands |
| QuickChart | quickchart-mcp | Generate simple charts |
| VegaLite | vegalite-mcp | Create JSON-driven charts |
| Mermaid | mermaid-mcp | Render Mermaid syntax diagrams |
| RFC | rfc-mcp | Retrieve IETF RFCs |
| NIST NVD | nist-mcp | Vulnerability and CVE data |
| Subnet Calculator | subnet-calculator-mcp | Calculates subnet info from CIDR IP |
| Excalidraw | excalidraw-mcp | Generate freeform drawings |
| GitHub | github-mcp | Interact with GitHub repositories |
| Slack | slack-mcp | Post messages to Slack |
| Google Maps | google-maps-mcp | Geocoding and location data |
| AbuseIPDB | abuseipdb-mcp | Query IP reputation |
| Selector AI | selector-ai-mcp | AI-based selector for data |
| Sequential Thinking | sequentialthinking-mcp | Logical task planning |
| Cisco Identity Services Engine | ise-mcp | ISE Operations |
When complete, you should see output like:

✅ drawio-mcp container running with both STDIO + WebSocket
✅ mcpyats container started at http://localhost:2024
✅ streamlit-app container started at http://localhost:8501
✅ local drawio container started at http://localhost:8080
✅ a2a-adapter container listening on http://localhost:10000
You can also verify running containers with:
docker ps
🧪 Test the System
Visit http://localhost:8501 โ Interact with MCpyATS via Streamlit
Use curl http://localhost:2024/docs to see API docs
Open http://localhost:8080 for standalone Draw.io
Post JSON to http://localhost:10000 to test A2A messaging
Watch logs with docker logs -f mcpyats
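The adapter's exact task schema is defined in a2a_adapter.py; as a starting point for that JSON post, a JSON-RPC 2.0 envelope for the send method might look like this (the params content is an assumption, only the jsonrpc/id/method fields are standard):

```python
import json

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "send",
    "params": {
        # Hypothetical task text; adjust to whatever schema the adapter expects.
        "message": "Run 'show ip interface brief' on R1",
    },
}

# Serialize for posting with curl or an HTTP client
body = json.dumps(payload)
print(body)
```

For example: save the output to payload.json, then `curl -X POST http://localhost:10000 -H "Content-Type: application/json" -d @payload.json`.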
🔄 Restarting
If you make code changes, restart individual containers with:
docker restart mcpyats
docker restart streamlit-app
Or restart all:
docker rm -f $(docker ps -aq)
./docker_startup.sh
📁 Shared Output
Many containers mount and use this shared volume:
/home/johncapobianco/MCPyATS/shared_output → /projects or /output

Adjust this path to match your own environment.
This allows tools like Filesystem MCP, Chart generators, and Draw.io to exchange files and artifacts in real time.
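Under the hood that exchange is ordinary file I/O on the mounted path; a sketch (save_artifact and the artifact contents are invented for illustration):

```python
from pathlib import Path
import json
import tempfile

def save_artifact(base: Path, name: str, data: dict) -> Path:
    """Write a JSON artifact where other containers (Draw.io, chart tools) can pick it up."""
    path = base / name
    path.write_text(json.dumps(data, indent=2))
    return path

# Demo against a temp dir; in MCPyATS this would be the shared_output mount.
base = Path(tempfile.mkdtemp())
artifact = save_artifact(base, "topology.json", {"nodes": ["R1", "R2"]})
print(json.loads(artifact.read_text())["nodes"])  # ['R1', 'R2']
```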
```json
{
  "mcpServers": {
    "mcpyats": {
      "command": "docker",
      "args": ["build", "-t", "github-mcp", "./mcp_servers/github"]
    }
  }
}
```