An MCP server for WolframAlpha's LLM API that returns structured knowledge and solves math problems.
A Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API (https://products.wolframalpha.com/llm-api/documentation).
The server exposes three tools:
- `ask_llm`: Ask WolframAlpha a question and get a structured, LLM-friendly response
- `get_simple_answer`: Get a simplified answer
- `validate_key`: Validate the WolframAlpha API key
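Any MCP client can call these tools once the server is running. Below is a minimal sketch using the @modelcontextprotocol/sdk TypeScript client; the argument name `query` and the build path are assumptions for illustration, not taken from the server's actual tool schema.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio, the same way Cline does via the settings file below.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/wolframalpha-mcp-server/build/index.js"],
  env: { WOLFRAM_LLM_APP_ID: process.env.WOLFRAM_LLM_APP_ID ?? "" },
});

const client = new Client({ name: "example-client", version: "0.1.0" }, { capabilities: {} });
await client.connect(transport);

// Ask WolframAlpha a question through the ask_llm tool.
// NOTE: "query" is an assumed argument name; check the server's tool schema for the real one.
const result = await client.callTool({
  name: "ask_llm",
  arguments: { query: "What is the integral of x^2 * sin(x)?" },
});
console.log(result.content);

await client.close();
```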
To install, clone the repository and install its dependencies:
git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
npm install
Get your WolframAlpha API key from developer.wolframalpha.com
Add it to your Cline MCP settings file in VS Code (e.g. ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):
```json
{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-mcp-server/build/index.js"],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": [
        "ask_llm",
        "get_simple_answer",
        "validate_key"
      ]
    }
  }
}
```
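The WOLFRAM_LLM_APP_ID environment variable is the App ID the server passes to WolframAlpha. For context, a direct request to the LLM API looks roughly like the sketch below, with the endpoint and parameters taken from the documentation linked above; this is an illustration, not the server's actual code.

```typescript
// Illustrative sketch of a direct WolframAlpha LLM API request (not the server's code).
const appId = process.env.WOLFRAM_LLM_APP_ID;
if (!appId) throw new Error("WOLFRAM_LLM_APP_ID is not set");

const url = new URL("https://www.llm-api.wolframalpha.com/api/v1/llm-api");
url.searchParams.set("appid", appId);
url.searchParams.set("input", "10 densest elemental metals");

const response = await fetch(url); // global fetch is available in Node 18+
if (!response.ok) {
  throw new Error(`WolframAlpha LLM API returned ${response.status}`);
}
console.log(await response.text()); // plain-text, LLM-friendly answer
```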
The tests use real API calls to ensure accurate responses. To run the tests:
Copy the example environment file:
cp .env.example .env
Edit .env and add your WolframAlpha API key:
WOLFRAM_LLM_APP_ID=your-api-key-here
Note: The .env file is gitignored to prevent committing sensitive information.
Run the tests:
npm test
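As a rough illustration of how the key from .env reaches the tests, a setup step along these lines would load and sanity-check it before any real API calls are made (this assumes the dotenv package; the repository's actual test harness may be wired differently):

```typescript
// Illustrative test setup (assumes dotenv; the real test suite may differ).
import "dotenv/config";               // loads WOLFRAM_LLM_APP_ID from .env into process.env
import assert from "node:assert/strict";

assert.ok(
  process.env.WOLFRAM_LLM_APP_ID,
  "WOLFRAM_LLM_APP_ID must be set in .env before running the real-API tests"
);
```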
Build the server:
npm run build
License: MIT