A powerful multi-database server implementing the Model Context Protocol (MCP) to provide AI assistants with structured access to databases.
The DB MCP Server provides a standardized way for AI models to interact with multiple databases simultaneously. Built on the FreePeak/cortex framework, it enables AI assistants to execute SQL queries, manage transactions, explore schemas, and analyze performance across different database systems through a unified interface.
Unlike traditional database connectors, DB MCP Server can connect to and interact with multiple databases concurrently:
{
"connections": [
{
"id": "mysql1",
"type": "mysql",
"host": "localhost",
"port": 3306,
"name": "db1",
"user": "user1",
"password": "password1"
},
{
"id": "postgres1",
"type": "postgres",
"host": "localhost",
"port": 5432,
"name": "db2",
"user": "user2",
"password": "password2"
}
]
}
For each connected database, the server automatically generates specialized tools:
// For a database with ID "mysql1", these tools are generated:
query_mysql1 // Execute SQL queries
execute_mysql1 // Run data modification statements
transaction_mysql1 // Manage transactions
schema_mysql1 // Explore database schema
performance_mysql1 // Analyze query performance
The server follows Clean Architecture principles, organizing functionality into distinct layers. The following databases are currently supported:
Database | Status | Features |
---|---|---|
MySQL | ✅ Full Support | Queries, Transactions, Schema Analysis, Performance Insights |
PostgreSQL | ✅ Full Support (v9.6-17) | Queries, Transactions, Schema Analysis, Performance Insights |
TimescaleDB | ✅ Full Support | Hypertables, Time-Series Queries, Continuous Aggregates, Compression, Retention Policies |
The DB MCP Server can be deployed in multiple ways to suit different environments and integration needs:
# Pull the latest image
docker pull freepeak/db-mcp-server:latest
# Run with mounted config file
docker run -p 9092:9092 \
-v $(pwd)/config.json:/app/my-config.json \
-e TRANSPORT_MODE=sse \
-e CONFIG_PATH=/app/my-config.json \
freepeak/db-mcp-server
Note: Mount your configuration to /app/my-config.json, as the container already ships with a default file at /app/config.json.
# Run the server in STDIO mode
./bin/server -t stdio -c config.json
For Cursor IDE integration, add the following to .cursor/mcp.json:
{
"mcpServers": {
"stdio-db-mcp-server": {
"command": "/path/to/db-mcp-server/server",
"args": ["-t", "stdio", "-c", "/path/to/config.json"]
}
}
}
# Default configuration (localhost:9092)
./bin/server -t sse -c config.json
# Custom host and port
./bin/server -t sse -host 0.0.0.0 -port 8080 -c config.json
Client connection endpoint: http://localhost:9092/sse
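To verify that the SSE endpoint is reachable, you can open a streaming connection with curl. This is only a quick smoke test, not a full MCP handshake, and it assumes the default host and port shown above:
# Open a streaming connection to the SSE endpoint (Ctrl+C to stop)
curl -N http://localhost:9092/sse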
# Clone the repository
git clone https://github.com/FreePeak/db-mcp-server.git
cd db-mcp-server
# Build the server
make build
# Run the server
./bin/server -t sse -c config.json
Create a config.json file with your database connections:
{
"connections": [
{
"id": "mysql1",
"type": "mysql",
"host": "mysql1",
"port": 3306,
"name": "db1",
"user": "user1",
"password": "password1",
"query_timeout": 60,
"max_open_conns": 20,
"max_idle_conns": 5,
"conn_max_lifetime_seconds": 300,
"conn_max_idle_time_seconds": 60
},
{
"id": "postgres1",
"type": "postgres",
"host": "postgres1",
"port": 5432,
"name": "db1",
"user": "user1",
"password": "password1"
}
]
}
# Basic syntax
./bin/server -t <transport> -c <config-file>
# SSE transport options
./bin/server -t sse -host <hostname> -port <port> -c <config-file>
# Inline database configuration
./bin/server -t stdio -db-config '{"connections":[...]}'
# Environment variable configuration
export DB_CONFIG='{"connections":[...]}'
./bin/server -t stdio
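As an illustration, here is a complete single-connection configuration passed through the DB_CONFIG environment variable. It follows the same schema as the config.json example above; the credentials are placeholders.
# Example: inline configuration via environment variable (placeholder values)
export DB_CONFIG='{"connections":[{"id":"mysql1","type":"mysql","host":"localhost","port":3306,"name":"db1","user":"user1","password":"password1"}]}'
./bin/server -t stdio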
For each connected database, DB MCP Server automatically generates these specialized tools:
Tool Name | Description |
---|---|
query_<db_id> | Execute SELECT queries and get results as a tabular dataset |
execute_<db_id> | Run data manipulation statements (INSERT, UPDATE, DELETE) |
transaction_<db_id> | Begin, commit, and rollback transactions |
Tool Name | Description |
---|---|
schema_<db_id> | Get information about tables, columns, indexes, and foreign keys |
generate_schema_<db_id> | Generate SQL or code from database schema |
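The schema generation tool is invoked in the same call style as the other tools. The call below is only a sketch; the table name and parameters are assumptions, so check the tool's own documentation for the exact arguments it accepts.
-- Generate SQL from the schema of a specific table (illustrative parameters)
generate_schema_mysql1("users")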
Tool Name | Description |
---|---|
performance_<db_id> | Analyze query performance and get optimization suggestions |
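A hedged example of invoking the performance tool, following the same call style as the query tools; the exact parameters may differ in your version.
-- Analyze a query and request optimization suggestions (illustrative parameters)
performance_mysql1("SELECT * FROM orders WHERE customer_id = 1")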
For PostgreSQL databases with TimescaleDB extension, these additional specialized tools are available:
Tool Name | Description |
---|---|
timescaledb_<db_id> | Perform general TimescaleDB operations |
create_hypertable_<db_id> | Convert a standard table to a TimescaleDB hypertable |
list_hypertables_<db_id> | List all hypertables in the database |
time_series_query_<db_id> | Execute optimized time-series queries with bucketing |
time_series_analyze_<db_id> | Analyze time-series data patterns |
continuous_aggregate_<db_id> | Create materialized views that automatically update |
refresh_continuous_aggregate_<db_id> | Manually refresh continuous aggregates |
For detailed documentation on TimescaleDB tools, see TIMESCALEDB_TOOLS.md.
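As a rough sketch of the call style, assuming a TimescaleDB-enabled connection with ID timescale1 and a metrics table with a time column; the arguments shown are illustrative, so consult TIMESCALEDB_TOOLS.md for the exact parameters.
-- Convert a table to a hypertable (illustrative parameters)
create_hypertable_timescale1("metrics", "time")
-- Run a bucketed time-series query (illustrative parameters)
time_series_query_timescale1("metrics", "1 hour", "avg(value)")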
-- Query the first database
query_mysql1("SELECT * FROM users LIMIT 10")
-- Query the second database in the same context
query_postgres1("SELECT * FROM products WHERE price > 100")
-- Start a transaction
transaction_mysql1("BEGIN")
-- Execute statements within the transaction
execute_mysql1("INSERT INTO orders (customer_id, product_id) VALUES (1, 2)")
execute_mysql1("UPDATE inventory SET stock = stock - 1 WHERE product_id = 2")
-- Commit or rollback
transaction_mysql1("COMMIT")
-- OR
transaction_mysql1("ROLLBACK")
-- Get all tables in the database
schema_mysql1("tables")
-- Get columns for a specific table
schema_mysql1("columns", "users")
-- Get constraints
schema_mysql1("constraints", "orders")
If queries time out, increase the query_timeout setting in your configuration.
Enable verbose logging for troubleshooting:
./bin/server -t sse -c config.json -v
We welcome contributions to the DB MCP Server project! To contribute:
1. Create a feature branch (git checkout -b feature/amazing-feature)
2. Commit your changes (git commit -m 'feat: add amazing feature')
3. Push the branch (git push origin feature/amazing-feature)
Please see our CONTRIBUTING.md file for detailed guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.