
mcp sandbox
A weather information service built on the Model Context Protocol (MCP), demonstrating how language models interact with external tools.
Weather MCP: Model Context Protocol Example
This project demonstrates an implementation of the Model Context Protocol (MCP) for a weather information service. It shows how language models (LLMs) can interact with external tools through a standardized interface.
What is MCP?
The Model Context Protocol (MCP) is an open standard that enables language models to discover and use tools during a conversation. MCP provides:
- Standard interfaces for tools and LLMs to communicate
- Runtime discovery of available tools and their capabilities
- Consistent tooling across different models and platforms
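During discovery, for example, each tool is advertised to the client with a name, a human-readable description, and a JSON Schema for its inputs. The descriptor below is only illustrative (the values are hypothetical), but its structure is what MCP standardizes:

# Illustrative tool descriptor as a client might see it after discovery.
forecast_tool = {
    "name": "get_forecast",
    "description": "Get the weather forecast for a location",
    "inputSchema": {
        "type": "object",
        "properties": {
            "latitude": {"type": "number"},
            "longitude": {"type": "number"},
        },
        "required": ["latitude", "longitude"],
    },
}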
Project Architecture
This project consists of two main components:
1. MCP Server (weather.py)
The server provides weather tools through the MCP protocol:
- Uses the FastMCP library to register and expose tools (see the sketch after this list)
- Communicates via stdin/stdout (stdio transport)
- Provides two main tools:
  - get_alerts: retrieves weather alerts for a US state
  - get_forecast: gets the weather forecast for a location by coordinates
- Fetches data from the National Weather Service (NWS) API
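A minimal sketch of how such a server might be wired up is shown below, assuming the FastMCP class from the mcp Python SDK; the get_alerts signature and the elided tool bodies are assumptions based on the descriptions above.

from mcp.server.fastmcp import FastMCP

# Name the server; clients see this identifier when they connect.
mcp = FastMCP("weather")

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Retrieve active weather alerts for a US state (e.g. 'CA')."""
    ...  # assumed: fetch and format data from the NWS alerts endpoint

@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Get the weather forecast for a location by coordinates."""
    ...  # assumed: fetch and format data from the NWS forecast endpoints

if __name__ == "__main__":
    # Serve over stdio so a client can spawn this script as a subprocess.
    mcp.run(transport="stdio")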
2. MCP Client (test_client.py)
The client coordinates between a language model (Claude) and the MCP server (a connection sketch follows this list):
- Spawns the server process and establishes communication
- Queries available tools from the server
- Exposes these tools to Claude through the Anthropic API
- Executes tool calls requested by Claude
- Returns tool results back to Claude for continued processing
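In code, the spawn-connect-discover sequence might look roughly like the sketch below, assuming the mcp SDK's stdio client helpers; the command used to launch the server is an assumption.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Spawn the server script as a subprocess and talk to it over stdin/stdout.
    server_params = StdioServerParameters(command="python", args=["mcp_server_weather.py"])
    async with stdio_client(server_params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()             # start the MCP session
            response = await session.list_tools()  # runtime discovery
            print("Available tools:", [tool.name for tool in response.tools])

asyncio.run(main())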
Client-Server Interaction Flow
The MCP architecture enables a seamless interaction between LLMs and tools:
1. Initialization:
   - Client spawns the server process
   - Client establishes a connection via stdio
   - Client calls initialize to start the MCP session
   - Client discovers available tools via list_tools
2. Tool Registration:
   - Server registers functions as tools using the @mcp.tool() decorator
   - Each tool has a name, description, and input schema
   - Tools are exposed through the MCP interface
3. User Interaction:
   - User submits a query to the client
   - Client forwards the query to Claude along with the available tools
4. Tool Execution Cycle (sketched in code after this list):
   - Claude decides whether to use tools based on the query
   - If tools are needed, Claude returns a tool_use object
   - Client sends the tool call to the server via MCP's call_tool
   - Server executes the tool and returns the result
   - Client forwards the tool result back to Claude
   - Claude incorporates the tool result into its response
5. Response Generation:
   - Claude processes all tool results
   - Claude generates a final response for the user
   - Client displays the final response
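Concretely, the tool execution cycle usually amounts to scanning Claude's response for tool_use blocks and relaying each one to the server. The sketch below is an assumption about how this client handles that step; response is a Claude message created with the discovered tools, session is the active MCP session, and messages is the conversation history.

for content in response.content:
    if content.type == "text":
        final_text = content.text
    elif content.type == "tool_use":
        # Relay Claude's requested call to the MCP server.
        result = await session.call_tool(content.name, content.input)
        # Record the assistant turn and feed the tool result back to Claude.
        messages.append({"role": "assistant", "content": response.content})
        messages.append({
            "role": "user",
            "content": [{
                "type": "tool_result",
                "tool_use_id": content.id,
                "content": result.content,
            }],
        })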
Setup and Usage
Prerequisites
- Python 3.9+
- An Anthropic API key (for Claude)
- UV - Fast Python package installer and resolver
Installation
- Clone this repository
- Set up with UV:
  # Install dependencies using UV
  uv sync
  # Or if using pyproject.toml
  uv pip install -e .
- Create a .env file with your Anthropic API key:
  ANTHROPIC_API_KEY=your_api_key_here
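The README does not show how the key is loaded; a common pattern is python-dotenv (an assumption here), and the anthropic SDK also reads ANTHROPIC_API_KEY from the environment on its own.

import os

from dotenv import load_dotenv  # assumes the python-dotenv package is installed

load_dotenv()  # reads the .env file from the current directory
api_key = os.environ["ANTHROPIC_API_KEY"]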
Running the Application
- Start the client with the server script:
  uv run test_client.py mcp_server_weather.py
- Type weather-related queries, such as:
  - "What's the weather forecast for San Francisco?"
  - "Are there any weather alerts in CA?"
  - "Will it rain tomorrow in Seattle? The coordinates are 47.6062, -122.3321"
Using the MCP Inspector
For debugging and testing your MCP server, you can use the MCP Inspector:
- Run the inspector with your server script:
  npx @modelcontextprotocol/inspector python weather.py
- This will open a web UI in your browser where you can:
  - View available tools and their schemas
  - Test tool calls directly without needing an LLM
  - Debug server behavior and responses
Technical Details
Server Implementation
The server uses the FastMCP class to register Python functions as tools:
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    # Implementation...
    ...
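The elided body would plausibly call the NWS API in two steps: resolve the coordinates to a gridpoint forecast URL via the /points endpoint, then fetch the forecast periods. The sketch below is an assumption about that implementation, using httpx against api.weather.gov:

import httpx

NWS_API_BASE = "https://api.weather.gov"

async def get_forecast(latitude: float, longitude: float) -> str:
    """Hypothetical tool body: fetch a short NWS forecast for a coordinate pair."""
    headers = {"User-Agent": "weather-mcp-example"}  # NWS asks clients to identify themselves
    async with httpx.AsyncClient(headers=headers) as client:
        # Step 1: resolve the point to its gridpoint forecast URL.
        points = await client.get(f"{NWS_API_BASE}/points/{latitude},{longitude}")
        forecast_url = points.json()["properties"]["forecast"]
        # Step 2: fetch the forecast periods themselves.
        forecast = await client.get(forecast_url)
        periods = forecast.json()["properties"]["periods"]
    # Return a compact, human-readable summary of the next few periods.
    return "\n".join(f"{p['name']}: {p['detailedForecast']}" for p in periods[:3])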
Client Implementation
The client manages the tool execution cycle:
# Execute tool call through MCP
result = await self.session.call_tool(tool_name, tool_args)
# Add tool result to conversation history
messages.append({
    "role": "user",
    "content": [
        {
            "type": "tool_result",
            "tool_use_id": content.id,
            "content": result.content,
        }
    ],
})
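Before that cycle can run, the client has to hand the discovered tools to Claude. A sketch of that step, assuming the anthropic SDK; the model name and token limit below are placeholders, not values taken from this project.

# Convert the MCP tool descriptors into the shape the Anthropic API expects.
response = await self.session.list_tools()
available_tools = [{
    "name": tool.name,
    "description": tool.description,
    "input_schema": tool.inputSchema,
} for tool in response.tools]

# Ask Claude to answer the query, allowing it to call the exposed tools.
message = self.anthropic.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=1000,
    messages=messages,
    tools=available_tools,
)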
Resources
- MCP Documentation
- Anthropic API Documentation
- National Weather Service API
License
MIT License
Quick Start
- Clone the repository:
  git clone https://github.com/samkeen/mcp-sandbox
  cd mcp-sandbox
- Install dependencies with UV:
  uv sync
- See the Setup and Usage section above for configuration and run instructions.