
# Weather MCP Agent (FastAPI + MCP SSE)

This project demonstrates a complete setup of a FastAPI-based MCP (Model Context Protocol) server paired with a FastAgent client using Server-Sent Events (SSE). It provides weather alerts and forecasts via the National Weather Service API.
## Project Structure

```
.
├── README.md
├── agent.py                 # FastAgent client entrypoint
├── fastagent.config.yaml    # FastAgent configuration (MCP servers, agent name)
├── fastagent.secrets.yaml   # Secret keys (OpenAI, Anthropic, etc.)
├── pyproject.toml           # Project dependencies (setuptools/poetry)
├── requirements.txt         # (Optional) pip install list
├── src/
│   ├── __init__.py
│   ├── main.py              # FastAPI server setup & SSE mount
│   ├── routes.py            # HTTP endpoints: /, /about, /status
│   ├── weather.py           # FastMCP server config & @mcp.tool functions
│   └── ...                  # egg-info, pycache, etc.
└── uv.lock                  # Lockfile for `uv` tool
```
## 1. Clone the Repository

Begin by cloning the repository to your local machine:

```bash
git clone git@github.com:sergeychernyakov/weather_mcp_agent.git
cd weather_mcp_agent
```
## 2. Create a Virtual Environment

It's recommended to use a virtual environment to manage dependencies:

```bash
python3 -m venv .venv
```
## 3. Activate the Virtual Environment

Activate the virtual environment before installing dependencies.

- Linux/macOS:

  ```bash
  source .venv/bin/activate
  ```

- Windows:

  ```bash
  .venv\Scripts\activate
  ```
## Installing Dependencies

Requires Python 3.12+ and either `uv` or `pip`.

```bash
# Using uv:
uv pip install -r pyproject.toml

# Or directly with pip:
pip install -r requirements.txt
```

If you don't have a requirements.txt, install manually:

```bash
pip install fastapi httpx "mcp[cli]" uvicorn fast-agent-mcp
```
## Running the FastAPI MCP Server

Launch the server before starting the agent:

```bash
uvicorn src.main:app --reload
```

The server exposes:

- `GET /` → HTML welcome page
- `GET /about` → Plain text info
- `GET /status` → JSON status
- `POST /messages` → Internal endpoint for MCP tool calls
- `GET /sse` → SSE endpoint for MCP clients
## Running the FastAgent Client

The `agent.py` script launches a FastAgent that connects to the MCP server via SSE:

```bash
# Ensure the server is running, then in a new shell:
uv run agent.py          # uses fastagent.config.yaml by default

# Or explicitly:
uv run agent.py -- --debug
```
## Interactive Tool Usage

Once the agent prompt appears:

```
default >
```

You can invoke your weather tools directly:

```
# Fetch active alerts for Texas
!get_weather_alerts state="TX"

# Fetch a point forecast by coordinates
!get_weather_forecast latitude=29.76 longitude=-95.36
```

- The leading `!` tells FastAgent to treat the input as a tool call, not a chat message.
- `state="TX"` must be in quotes; numeric arguments can be unquoted.
- Responses from the MCP server stream back via SSE and print in your console.
## Debugging & Development

- Inspect tools locally with MCP Inspector:

  ```bash
  mcp dev ./src/weather.py
  ```

  This opens a UI at http://127.0.0.1:6274 and proxies SSE.

- Agent debug mode:

  ```bash
  uv run agent.py -- --debug
  ```

  This shows connection logs, SSE events, and tool calls.
## 🧑‍💻 Using with ChatGPT (Function Calling via OpenAPI)

You can use this server with ChatGPT custom GPTs or assistants that support OpenAPI schema-based tool calling.
### 1. Expose your server publicly

Run ngrok to expose your local FastAPI server:

```bash
ngrok http 8000
```

This will provide you with a public HTTPS URL like:

```
https://127c-156-253-249-23.ngrok-free.app
```
### 2. Import OpenAPI in GPT Builder

- Open ChatGPT → Explore GPTs.
- Choose your custom GPT or create a new one.
- Under "Actions", click "Import from URL".
- Paste:

  ```
  https://127c-156-253-249-23.ngrok-free.app/openapi.json
  ```

- ChatGPT will fetch the schema and automatically detect:
  - `get_weather_alerts(state: string)`
  - `get_weather_forecast(latitude: number, longitude: number)`
- Save and try asking:

  > What's the weather forecast in New York?

ChatGPT will invoke your FastAPI MCP server and return the result.
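For reference, ChatGPT derives those signatures from FastAPI's auto-generated schema. Assuming the tools are also exposed as regular HTTP routes, the relevant slice of `openapi.json` would look roughly like this (the path, summary, and field names are illustrative, not copied from the running server):

```json
{
  "paths": {
    "/get_weather_alerts": {
      "post": {
        "operationId": "get_weather_alerts",
        "summary": "Fetch active NWS alerts for a US state",
        "requestBody": {
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": { "state": { "type": "string" } },
                "required": ["state"]
              }
            }
          }
        }
      }
    }
  }
}
```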
### 3. Function-Calling with the OpenAI API

If you're writing your own ChatGPT-style client, you can drive tool use from the same OpenAPI schema. Note that the `functions` parameter expects a list of function schemas, not raw OpenAPI paths, so convert the spec first:

```python
import json

import openai

# Load the OpenAPI schema exported by FastAPI at /openapi.json.
with open("openapi.json") as f:
    spec = json.load(f)

# Convert each OpenAPI operation into a function schema.
functions = [
    {
        "name": op["operationId"],
        "description": op.get("summary", ""),
        "parameters": op.get("requestBody", {}).get("content", {})
        .get("application/json", {}).get("schema", {"type": "object", "properties": {}}),
    }
    for methods in spec["paths"].values()
    for op in methods.values()
    if "operationId" in op
]

response = openai.ChatCompletion.create(  # legacy API (openai < 1.0)
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Show me weather alerts for CA"}],
    functions=functions,
    function_call="auto",
)
```

The model will then invoke your `/messages` → `/sse` pipeline under the hood and return the tool's JSON response.
## Additional Notes

- The configuration file (`fastagent.config.yaml`) must match the agent name and server keys.
- Ensure the `weather` MCP server is defined under `mcp:` in the config.
- `fast-agent-mcp` version ≥ 0.2.19 is required for SSE transport.
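Putting those notes together, a minimal `fastagent.config.yaml` for this setup might look like the following. This is a hedged sketch: the server key, URL, and model are assumptions based on the defaults used elsewhere in this README, so check the fast-agent-mcp documentation for the exact schema:

```yaml
# Hypothetical config sketch — adjust to your deployment.
default_model: gpt-4o-mini

mcp:
  servers:
    weather:                # server key referenced by the agent
      transport: sse
      url: http://127.0.0.1:8000/sse
```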
## Author

Sergey Chernyakov

📬 Telegram: @AIBotsTech