DeFiLlama MCP is a powerful and flexible tool that provides a microservice-based API wrapper around the DeFi Llama ecosystem. It leverages the FastMCP framework to transform DeFi Llama's comprehensive DeFi data into easily accessible tool endpoints that can be integrated with various AI applications, including LLM agents and autonomous systems.
This project serves as a bridge between the rich data sources provided by DeFi Llama and the emerging needs of AI-driven applications in the Web3 space. By wrapping DeFi Llama's APIs in the standardized MCP (Model Context Protocol) format, developers can quickly integrate real-time DeFi data into their AI systems without dealing with the complexities of direct API integration.
DeFiLlama MCP is built on a modular architecture that separates the data-fetching logic from the API interface, with each DeFi Llama endpoint exposed as an independent MCP tool.
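As a rough illustration of that separation (a simplified stand-in for the FastMCP wiring, not the project's actual code; `tool`, `TOOLS`, and `dispatch` are names invented for this sketch), the data-fetching functions and the tool interface can be kept independent:

```python
import asyncio
from typing import Awaitable, Callable, Dict

# Tool registry: maps tool names to async data-fetching functions.
TOOLS: Dict[str, Callable[..., Awaitable[dict]]] = {}

def tool(name: str):
    """Register an async function as a named tool endpoint."""
    def decorator(fn):
        TOOLS[name] = fn
        return fn
    return decorator

@tool("get_protocol_tvl")
async def get_protocol_tvl(protocol: str) -> dict:
    # The real server would call the DeFi Llama API here (e.g. via httpx);
    # a canned response keeps this sketch self-contained.
    return {"protocol": protocol, "total": 6_240_000_000}

async def dispatch(name: str, **kwargs) -> dict:
    """The API interface: look up a registered tool and invoke it."""
    return await TOOLS[name](**kwargs)

result = asyncio.run(dispatch("get_protocol_tvl", protocol="aave"))
print(result)  # {'protocol': 'aave', 'total': 6240000000}
```

The fetch layer can be tested or swapped without touching the interface, which is what makes the MCP wrapper thin.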
```bash
# Clone the repository
git clone https://github.com/demcp/defillama-mcp.git
cd defillama-mcp

# Create a virtual environment and install dependencies
uv venv
uv pip install -e .

# Run the server
uv run defillama.py
```
```bash
# Clone the repository
git clone https://github.com/demcp/defillama-mcp.git
cd defillama-mcp

# Build the Docker image
docker build -t defillama-mcp .

# Run the container
docker run -p 8080:8080 defillama-mcp
```
Once the server is running, it exposes several endpoints that can be used to interact with DeFi Llama data:
- `get_protocols`: Retrieve information about top DeFi protocols.
- `get_protocol_tvl`: Get TVL data for a specific protocol.
- `get_chain_tvl`: Access historical TVL data for a specific blockchain.
- `get_token_prices`: Obtain current price information for specific tokens.
- `get_pools`: List available liquidity pools.
- `get_pool_tvl`: Get detailed information about a specific liquidity pool.

For example, a tool can be called directly over HTTP:

```python
import asyncio

import httpx

# Query the MCP server for protocol data
async def get_protocol_data(protocol_name: str):
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:8080/tools/get_protocol_tvl",
            json={"protocol": protocol_name},
        )
        return response.json()

# Usage
result = asyncio.run(get_protocol_data("aave"))
print(result)
```
Example: LangChain integration.

```python
from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

# Load DeFiLlama MCP tools
tools = load_tools(["defillama-mcp"], base_url="http://localhost:8080")

# Initialize an agent with the tools
llm = OpenAI(temperature=0)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)

# Run the agent
agent.run("What is the current TVL of Uniswap?")
```
Example: AutoGen integration.

```python
from autogen import ConversableAgent

financial_analyst = ConversableAgent(
    name="FinancialAnalyst",
    llm_config={
        "tools": [
            {
                "name": "defillama_protocol_tvl",
                "url": "http://localhost:8080/tools/get_protocol_tvl"
            }
        ]
    }
)

# The agent can now access DeFi data during its reasoning process
financial_analyst.initiate_chat("Analyze the TVL trends for Aave protocol")
```
`get_protocols`: Returns a list of DeFi protocols tracked by DeFi Llama.

Response:

```json
[
  {
    "id": "ethereum:0x7fc66500c84a76ad7e9c93437bfc5ac33e2ddae9",
    "name": "Aave",
    "symbol": "AAVE",
    "chain": "Ethereum",
    "tvl": 6240000000
  },
  ...
]
```
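Client code can rank or filter this list locally. A minimal sketch using the schema above (the second entry, `ExampleSwap`, is made up for illustration):

```python
# Sample entries following the get_protocols response schema; the
# second protocol is fictional, included only to make sorting visible.
protocols = [
    {"name": "Aave", "symbol": "AAVE", "chain": "Ethereum", "tvl": 6_240_000_000},
    {"name": "ExampleSwap", "symbol": "EXS", "chain": "Polygon", "tvl": 120_000_000},
]

# Rank by TVL, largest first
ranked = sorted(protocols, key=lambda p: p["tvl"], reverse=True)
top = ranked[0]["name"]
print(top)  # Aave
```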
`get_protocol_tvl`: Get TVL information for a specific protocol.

Request:

```json
{
  "protocol": "aave"
}
```

Response:

```json
{
  "ethereum": 3240000000,
  "polygon": 980000000,
  "avalanche": 570000000,
  "optimism": 450000000,
  "arbitrum": 1000000000,
  "total": 6240000000
}
```
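The per-chain values in the sample response sum to the `total` field, so a client can sanity-check the breakdown and derive per-chain shares directly:

```python
# Sample get_protocol_tvl response from above
tvl = {
    "ethereum": 3_240_000_000,
    "polygon": 980_000_000,
    "avalanche": 570_000_000,
    "optimism": 450_000_000,
    "arbitrum": 1_000_000_000,
    "total": 6_240_000_000,
}

# Per-chain breakdown should add up to the reported total
chains = {k: v for k, v in tvl.items() if k != "total"}
assert sum(chains.values()) == tvl["total"]

# Ethereum's share of the protocol's total TVL
share = chains["ethereum"] / tvl["total"]
print(f"{share:.1%}")  # 51.9%
```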
`get_chain_tvl`: Get historical TVL data for a blockchain.

Request:

```json
{
  "chain": "ethereum"
}
```

Response:

```json
[
  {
    "date": "2023-01-01",
    "tvl": 28500000000
  },
  {
    "date": "2023-01-02",
    "tvl": 28700000000
  },
  ...
]
```
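Since the response is a date-ordered time series, a day-over-day change falls out of two adjacent entries. Using the sample data above:

```python
# Sample get_chain_tvl response (date-ordered)
series = [
    {"date": "2023-01-01", "tvl": 28_500_000_000},
    {"date": "2023-01-02", "tvl": 28_700_000_000},
]

# Day-over-day relative change between the last two data points
prev, curr = series[-2]["tvl"], series[-1]["tvl"]
change = (curr - prev) / prev
print(f"{change:+.2%}")  # +0.70%
```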
Contributions are welcome! Here's how you can help improve DeFiLlama MCP:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
For questions, suggestions, or discussions about this project, please open an issue on GitHub or contact the maintainers.
Built with ❤️ for the Web3 and AI communities
To run the server directly:

```bash
uv run defillama.py
```

An MCP client can also launch the server via Docker with a configuration such as:

```json
{
  "mcpServers": {
    "demcp-defillama-mcp": {
      "command": "docker",
      "args": ["run", "-p", "8080:8080", "defillama-mcp"]
    }
  }
}
```