A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.
Clone the repository:
git clone https://github.com/rakesh-eltropy/mcp-client.git
After cloning the repository, navigate to the project directory:
cd mcp-client
Set the OPENAI_API_KEY environment variable:
export OPENAI_API_KEY=your-openai-api-key
You can also set the OPENAI_API_KEY in the mcp-server-config.json file, along with the provider and model (e.g. provider can be ollama and model can be llama3.2:3b).
Set the BRAVE_API_KEY environment variable:
export BRAVE_API_KEY=your-brave-api-key
You can also set the BRAVE_API_KEY in the mcp-server-config.json file. You can get a free BRAVE_API_KEY from the Brave Search API.
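The exact schema of mcp-server-config.json is not reproduced here; based on the settings mentioned above, an entry might look roughly like the sketch below (the field names and nesting are assumptions — check the actual file shipped with the repository):

```json
{
  "OPENAI_API_KEY": "your-openai-api-key",
  "BRAVE_API_KEY": "your-brave-api-key",
  "provider": "ollama",
  "model": "llama3.2:3b"
}
```

Environment variables, when set, typically take precedence over file-based configuration, so exporting the keys as shown above is the quickest way to get started.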
Running from the CLI:
uv run cli.py
To explore the available commands, use the help option. You can chat with the LLM using the chat command.
Sample prompts:
What is the capital city of India?
Search for the most expensive product in the database and find more details about it on Amazon.
Running from the REST API:
uvicorn app:app --reload
You can use the following curl command to chat with the LLM:
curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
You can use the following curl command to chat with the LLM with streaming enabled:
curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
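If you prefer calling the REST API from Python instead of curl, a minimal sketch using only the standard library is shown below. It assumes the /chat endpoint and the "streaming" flag behave exactly as in the curl examples above; the helper name is illustrative, not part of the project.

```python
import json
import urllib.request

# Default address when the server is started with `uvicorn app:app --reload`.
API_URL = "http://localhost:8000/chat"

def build_chat_request(message: str, streaming: bool = False) -> urllib.request.Request:
    """Build the JSON POST request the /chat endpoint expects."""
    payload = {"message": message}
    if streaming:
        payload["streaming"] = True
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send the request (requires the server to be running):
#   with urllib.request.urlopen(build_chat_request("list all the products from my local database?")) as resp:
#       print(resp.read().decode("utf-8"))
```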
Feel free to submit issues and pull requests for improvements or bug fixes.