
mcp_demo
Demonstrates how to implement Anthropic's Model Context Protocol (MCP) using Python and Docker.
MCP Demo
This project demonstrates Anthropic's Model Context Protocol (MCP) using Python and Docker. It showcases the client-server architecture of MCP with practical tool examples.
What is MCP?
The Model Context Protocol (MCP) is an open standard created by Anthropic for connecting AI models to external data sources and tools. It standardizes how applications provide context to Large Language Models (LLMs).
Project Structure
mcp-demo/
├── docker-compose.yml
├── .env.example        # Template for environment variables
├── .gitignore          # Git ignore file
├── CLAUDE.md           # Guidelines for Claude Code
├── server/
│   ├── Dockerfile
│   ├── requirements.txt
│   └── server.py
└── client/
    ├── Dockerfile
    ├── requirements.txt
    └── client.py
Components
- MCP Server: Exposes tools and resources that can be used by an MCP client
- MCP Client: Connects to the server and accesses its tools and resources
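To make the client-server flow concrete, here is a minimal sketch of how an MCP client can list and call tools with the official Python MCP SDK. It assumes the stdio transport (spawning the server as a subprocess); the containerized demo, where client and server run as separate services, would use a network transport instead, so treat this only as an illustration of the session API:

```python
# Hypothetical sketch of an MCP client using the Python SDK's stdio transport.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server as a subprocess and communicate over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server exposes.
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            # Call one of the demo tools by name with keyword arguments.
            result = await session.call_tool("greet", {"name": "World"})
            print("greet ->", result.content)


if __name__ == "__main__":
    asyncio.run(main())
```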
Features
The demo includes several tools:
- greet - A simple greeting tool that returns a personalized message
- add - A calculator that adds two numbers together
- generate_story - A creative story generator with different themes and lengths
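As a sketch of how tools like these might be defined in server/server.py, assuming the server uses the FastMCP helper from the official Python MCP SDK (the demo's actual implementation may differ):

```python
# Hypothetical sketch of server/server.py using the MCP Python SDK's FastMCP helper.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-demo")


@mcp.tool()
def greet(name: str) -> str:
    """Return a personalized greeting."""
    return f"Hello, {name}! Welcome to the MCP demo."


@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b


@mcp.tool()
def generate_story(theme: str = "adventure", length: str = "short") -> str:
    """Generate a tiny canned story for the given theme and length."""
    story = f"Once upon a time, in a {theme} tale, a curious traveler set out."
    return story if length == "short" else story + " Many chapters followed."


if __name__ == "__main__":
    mcp.run()
```

Each decorated function is registered as a tool whose name, docstring, and type hints become the metadata a client sees when it calls list_tools.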
Prerequisites
- Docker and Docker Compose
- An Anthropic API key (optional, for advanced use cases)
Setup Instructions
1. Clone this repository:
   git clone https://github.com/yourusername/mcp-demo.git
   cd mcp-demo
2. (Optional) Create a .env file from the template:
   cp .env.example .env
   # Edit .env to add your Anthropic API key if needed
3. Start the containers with Docker Compose:
   docker-compose up --build
4. The client will connect to the server and demonstrate:
   - Listing available tools
   - Calling the greeting and addition tools
   - Generating creative stories with different themes and lengths
5. To stop the containers:
   docker-compose down
Extending the Demo
This demo can be extended in several ways:
- Add more complex tools to the server (e.g., database access, external API integration)
- Connect the client to an actual Claude model using the Anthropic API (a sketch follows this list)
- Create a more interactive application with a web UI
- Implement tool streaming for real-time updates
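As one hedged illustration of the Anthropic API extension, the MCP tool list could be translated into Anthropic's tool-use format and passed to the Messages API. The helper below is hypothetical and not part of the demo; the session argument and the model alias are assumptions:

```python
# Hypothetical bridge from MCP tools to the Anthropic Messages API.
# Assumes `session` is an initialized mcp.ClientSession and that
# ANTHROPIC_API_KEY is set in the environment (e.g. via the .env file).
import anthropic
from mcp import ClientSession


async def ask_claude_with_mcp_tools(session: ClientSession, prompt: str):
    client = anthropic.Anthropic()

    # Convert MCP tool metadata into Anthropic's tool schema.
    mcp_tools = await session.list_tools()
    tools = [
        {
            "name": t.name,
            "description": t.description or "",
            "input_schema": t.inputSchema,
        }
        for t in mcp_tools.tools
    ]

    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # assumed alias; use any current Claude model
        max_tokens=1024,
        tools=tools,
        messages=[{"role": "user", "content": prompt}],
    )

    # If Claude decided to call a tool, forward the call to the MCP server.
    for block in response.content:
        if block.type == "tool_use":
            result = await session.call_tool(block.name, block.input)
            print(f"{block.name} -> {result.content}")

    return response
```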
Troubleshooting
- Connection Issues: Ensure Docker networking is correctly set up. The client container should be able to reach the server on the Docker network.
- API Key: If using the Anthropic API integration, ensure your API key is correctly set in the .env file.
- Logs: Check the Docker logs for detailed error messages (docker-compose logs).
Resources
- Model Context Protocol Documentation
- Python MCP SDK
- Anthropic's Claude Documentation
- Docker Networking Guide