# MCP: Build Rich-Context AI Apps with Anthropic
A comprehensive collection of courses and tutorials for learning the Model Context Protocol (MCP) - an open protocol that standardizes how AI applications connect to external tools and data sources.
The Model Context Protocol (MCP) enables AI models to connect with external data sources, tools, and environments, allowing for seamless transfer of information and capabilities between AI systems and the broader digital world. MCP transforms the complex M×N integration problem into a simple M+N solution by providing a standard interface.
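The M×N vs. M+N claim above can be illustrated with a quick calculation (the counts below are hypothetical):

```python
# Hypothetical example: 5 AI applications, 8 external tools/data sources.
apps, tools = 5, 8

# Without a standard protocol, every app needs a custom adapter for every tool.
point_to_point = apps * tools   # M x N custom integrations

# With MCP, each app implements one client and each tool one server.
with_mcp = apps + tools         # M + N implementations

print(point_to_point)  # 40
print(with_mcp)        # 13
```

The savings grow with scale: each new tool added to an MCP ecosystem requires one server, not one adapter per application.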
Status: Available
A comprehensive course covering MCP implementation with practical examples using Streamlit, Wikipedia, and arXiv integrations.
### Fundamentals & Tool Integration
- `3_streamlit_tool_use_arxiv.py` - Streamlit app with arXiv paper search functionality
- `3_arxiv_mcp_server.py` - MCP server for arXiv paper search and information extraction
- `4_streamlit_tool_use_wikipedia.py` - Streamlit app with Wikipedia integration
- `4_wikipedia_mcp_server_sse.py` - Wikipedia MCP server with SSE transport
- `4_wikipedia_mcp_server_stdio.py` - Wikipedia MCP server with STDIO transport

### MCP Client Implementation
- `5_mcp_client.py` - Basic MCP client implementation
- `5_streamlit_mcp_client.py` - Streamlit interface for a single MCP server connection
- `6_streamlit_mcp_client_multiple.py` - Advanced multi-server MCP client with Streamlit UI

### Advanced Server Implementations
- `7_wikipedia_mcp_server_stdio_prompts_resources.py` - Full-featured Wikipedia server with prompts and resources
- `7_wikipedia_mcp_client_prompts_resources_stdio.py` - Client for the advanced Wikipedia server
- `7_wikipedia_mcp_server_prompts_resources_sse copy.py` - SSE version with prompts and resources
- `7_wikipedia_mcp_server_prompts_resources_streamable-http.py` - Streamable HTTP version (recommended for production)
- `server_config.json` - Multi-server configuration file

Status: Coming Soon
Advanced data science and analytics integration with the MCP protocol.
Status: Coming Soon
Machine learning model integration and deployment using MCP with the Hugging Face ecosystem.
```bash
git clone https://github.com/davila7/mcp-courses.git
cd mcp-courses
pip install -r requirements.txt
```
```bash
# Copy and configure your API keys
cp .env.example .env
# Edit .env with your API keys (Anthropic, Brave Search, etc.)
```
```bash
# Start the Wikipedia MCP server with STDIO
python deeplearning_course/4_wikipedia_mcp_server_stdio.py
```
```bash
# Single-server client
streamlit run deeplearning_course/5_streamlit_mcp_client.py

# Multi-server client (recommended)
streamlit run deeplearning_course/6_streamlit_mcp_client_multiple.py
```
Transport Protocols Supported:

- STDIO - local subprocess communication over stdin/stdout
- SSE - Server-Sent Events over HTTP
- Streamable HTTP - recommended for production deployments
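Whatever the transport, MCP clients and servers exchange JSON-RPC 2.0 messages; over STDIO each message is a newline-delimited JSON line on stdin/stdout. A minimal sketch of building and round-tripping an `initialize` request with only the standard library (the exact field values shown are illustrative, not a complete handshake):

```python
import json

# An MCP 'initialize' request framed as a JSON-RPC 2.0 message
# (field values illustrative).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
        "capabilities": {},
    },
}

# Over STDIO, the message is serialized as a single line of JSON.
wire = json.dumps(request)

# The server parses the line back into a message and dispatches on "method".
decoded = json.loads(wire)
print(decoded["method"])  # initialize
```

In practice the course files use the MCP SDK, which handles this framing, the capability negotiation, and the SSE/Streamable HTTP variants for you.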
```
mcp-courses/
├── deeplearning_course/       # Complete MCP course materials
│   ├── papers/                # arXiv paper storage
│   ├── wiki_articles/         # Wikipedia article cache
│   └── server_config.json     # Multi-server configuration
├── main.py                    # Repository entry point
├── requirements.txt           # Python dependencies
└── pyproject.toml             # Project configuration
```
Contributions are welcome! Please feel free to submit issues, feature requests, or pull requests.
This project is open source and available under the MIT License.
```json
{
  "mcpServers": {
    "mcp-courses": {
      "command": "python",
      "args": ["deeplearning_course/4_wikipedia_mcp_server_stdio.py"]
    }
  }
}
```
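A minimal sketch of how a multi-server client might read this configuration and assemble the launch command for each server, using only the standard library (the course's actual client may work differently):

```python
import json

# A server_config.json-style document, inlined here for illustration.
config_text = """
{
  "mcpServers": {
    "mcp-courses": {
      "command": "python",
      "args": ["deeplearning_course/4_wikipedia_mcp_server_stdio.py"]
    }
  }
}
"""
config = json.loads(config_text)

# Build the command line each server would be launched with
# (e.g. via subprocess.Popen, communicating over STDIO).
for name, server in config["mcpServers"].items():
    cmd = [server["command"], *server.get("args", [])]
    print(name, cmd)
```

Each entry under `mcpServers` names one server; adding another entry is all it takes for a multi-server client to pick it up.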