MCP client for local Ollama models
Ollama MCP is a tool for connecting Ollama-based language models with external tools and services using the Model Context Protocol (MCP). This integration enables LLMs to interact with various systems like Git repositories, shell commands, and other tool-enabled services.
```shell
# Create a virtual environment
uv venv

# Activate the virtual environment
source .venv/bin/activate

# Install the package in development mode
uv pip install -e .
```
Before the main application starts, you will be prompted to select an Ollama model to use.
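The selection step can be sketched as follows. This is illustrative only, not the project's actual code: it assumes `ollama list` prints a table whose first column is the model name, with a single header row.

```python
import subprocess

def parse_model_names(ollama_list_output: str) -> list[str]:
    """Extract model names (first column) from `ollama list` output."""
    lines = ollama_list_output.strip().splitlines()
    # Skip the header row ("NAME  ID  SIZE  MODIFIED")
    return [line.split()[0] for line in lines[1:] if line.strip()]

def choose_model() -> str:
    """List locally available models and ask the user to pick one by number."""
    out = subprocess.run(
        ["ollama", "list"], capture_output=True, text=True, check=True
    ).stdout
    names = parse_model_names(out)
    for i, name in enumerate(names, 1):
        print(f"{i}. {name}")
    index = int(input("Select a model: ")) - 1
    return names[index]
```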
Prerequisites:

Make sure Ollama is installed and that you have pulled a model that supports tool use, for example:

```shell
ollama pull llama3.1:8b
```

A list of models that support tool usage can be found on the Ollama website.

Startup Process:
```shell
uv run main.py
```

This will start an interactive CLI where you can ask the assistant to perform Git operations.

To run the tests:

```shell
pytest -xvs tests/test_ollama_toolmanager.py
```
You can extend the system by registering additional tools through `OllamaToolManager`. For example:
```python
import asyncio
from mcp import StdioServerParameters

async def main() -> None:
    # Create a Git-enabled agent
    git_params = StdioServerParameters(
        command="uvx",
        args=["mcp-server-git", "--repository", "/path/to/repo"],
        env=None,
    )

    # Connect and register tools
    async with MCPClient(git_params) as client:
        # Register the server's tools with the agent,
        # then use the agent for Git operations
        ...

asyncio.run(main())
```
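Under the hood, registering a tool amounts to describing a function in the `tools` format of Ollama's chat API. The helper below is a hypothetical sketch of that conversion (`make_tool_schema` and `git_status` are illustrative names, not `OllamaToolManager`'s real interface); for simplicity it types every parameter as a string.

```python
import inspect

def make_tool_schema(fn) -> dict:
    """Build an Ollama-style tool definition from a Python function."""
    sig = inspect.signature(fn)
    properties = {
        name: {"type": "string", "description": name}
        for name in sig.parameters
    }
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": list(sig.parameters),
            },
        },
    }

def git_status(repository: str) -> str:
    """Show the working-tree status of a repository."""
    ...

schema = make_tool_schema(git_status)
```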
```json
{
  "mcpServers": {
    "ollama-mcp-client": {
      "command": "uvx",
      "args": ["mcp-server-git", "--repository", "/path/to/repo"]
    }
  }
}
```
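A config in this `mcpServers` shape can be consumed programmatically; the helper below is a hypothetical sketch (`server_launch_spec` is not part of the project) showing how a named entry maps to the command and arguments used to launch that MCP server.

```python
import json

def server_launch_spec(config_text: str, name: str) -> tuple[str, list[str]]:
    """Return (command, args) for the named entry in an mcpServers config."""
    entry = json.loads(config_text)["mcpServers"][name]
    return entry["command"], entry.get("args", [])

config = '''
{
  "mcpServers": {
    "ollama-mcp-client": {
      "command": "uvx",
      "args": ["mcp-server-git", "--repository", "/path/to/repo"]
    }
  }
}
'''
cmd, args = server_launch_spec(config, "ollama-mcp-client")
```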