luketych / playground--mcp-model_context_protocol

Demonstrates communication between two applications using the Model Context Protocol (MCP), including a lightweight, stateless MCP server.

Repository Info

Stars: 0 · Forks: 0 · Watchers: 0 · Issues: 0 · Language: Python · License: none

About This Server

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

MCP Communication Project

Overview

This project demonstrates communication between two applications using the Model Context Protocol (MCP). It implements a Fast MCP style with a lightweight, stateless MCP Server acting as a middleman.

Components

  1. MCP Server: A FastAPI-based server that manages and routes context between applications using Fast MCP principles (minimal memory, stateless delivery).
  2. App A: An email summarization service that uses OpenAI's GPT-4 to process emails and forward them through MCP.
  3. App B: A reply generation service that uses Anthropic's Claude to generate responses based on received context.

Prerequisites

  • Python 3.8 or higher
  • OpenAI API key (for App A)
  • Anthropic API key (for App B)
  • Available ports (configured in config.py):
    • 9002 (MCP Server)
    • 8002 (App A)
    • 8003 (App B)
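The port layout above is centralized in config.py. A minimal sketch of what that file might contain (variable names here are illustrative, not taken from the repo):

```python
# Sketch of a central config.py for the ports listed above
# (hypothetical variable names; the repo's config.py may differ).
MCP_SERVER_PORT = 9002
APP_A_PORT = 8002
APP_B_PORT = 8003

MCP_SERVER_URL = f"http://localhost:{MCP_SERVER_PORT}"
APP_A_URL = f"http://localhost:{APP_A_PORT}"
APP_B_URL = f"http://localhost:{APP_B_PORT}"
```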

Installation

This project uses the uv package manager for faster dependency management.

  1. Install uv:

    pip install uv

  2. Create and activate a virtual environment:

    uv venv
    source .venv/bin/activate  # On macOS/Linux

  3. Install the project and its dependencies:

    uv pip install -e .

Configuration

  1. Configure OpenAI API key in app_a/llm_client.py:

    OPENAI_API_KEY = "your_openai_key"
    
  2. Configure Anthropic API key in app_b/llm_client.py:

    ANTHROPIC_API_KEY = "your_anthropic_key"
    
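Hard-coding keys is fine for a playground project; a common alternative is to read them from the environment instead (the variable names below are conventional, not taken from the repo):

```python
import os

# Read API keys from the environment, falling back to placeholders.
# This keeps secrets out of the source files in llm_client.py.
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "your_openai_key")
ANTHROPIC_API_KEY = os.environ.get("ANTHROPIC_API_KEY", "your_anthropic_key")
```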

Running the Services

Make sure your virtual environment is activated. To start all services at once, run:

    python main.py

This will start:

  • MCP Server on port 9002
  • App A on port 8002
  • App B on port 8003
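A process manager like main.py could launch each FastAPI app with uvicorn in its own subprocess. This is a sketch under that assumption; the repo's actual main.py may differ:

```python
import subprocess
import sys

# Hypothetical sketch of a main.py-style process manager: each service
# directory contains an app.py exposing a FastAPI `app` object.
SERVICES = [("mcp_server", 9002), ("app_a", 8002), ("app_b", 8003)]

def build_command(port: int) -> list:
    """Build the uvicorn invocation for one service."""
    return [sys.executable, "-m", "uvicorn", "app:app", f"--port={port}"]

def start_all() -> list:
    """Start every service as a subprocess; returns the running processes."""
    return [subprocess.Popen(build_command(port), cwd=name)
            for name, port in SERVICES]
```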

Alternatively, start each service individually, each in its own terminal:

  1. Start the MCP Server:

    cd mcp_server
    python -m uvicorn app:app --port=9002
    
  2. Start App A:

    cd app_a
    python -m uvicorn app:app --port=8002
    
  3. Start App B:

    cd app_b
    python -m uvicorn app:app --port=8003
    

Testing the Setup

  1. Send a test email to App A:

    curl -X POST "http://localhost:8002/summarize" \
      -H "Content-Type: application/json" \
      -d '{"email":"Hello, I need help with my laptop refund."}'
    
  2. Check App B's response:

    curl http://localhost:8003/poll
    

Project Structure

/project-root
    /src
        /mcp_server
            app.py         # FastAPI app running the MCP server
            router.py      # Handles MCP routing with stateless delivery
        /app_a
            app.py         # API to trigger summarization
            llm_client.py  # OpenAI API client
            mcp_handler.py # Creates and sends MCP packages
        /app_b
            app.py         # Polls MCP server for messages
            llm_client.py  # Anthropic API client
            mcp_handler.py # Parses received MCP packages
        config.py          # Central configuration for ports and URLs
        main.py            # Process manager to run all services

Flow Diagram

  1. App A receives an email via /summarize
  2. App A summarizes it using OpenAI and builds an MCP package
  3. App A sends the MCP package to MCP Server
  4. App B polls the MCP Server via /poll
  5. MCP Server delivers the package to App B
  6. App B parses the package into a prompt
  7. App B uses Claude to generate a reply
  8. App B returns the generated reply
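The MCP package exchanged in steps 2–5 could be a simple JSON envelope carrying sender, recipient, and context. This is a sketch; the actual schema built by mcp_handler.py may differ:

```python
import json

# Hypothetical MCP package envelope (field names are assumptions).
package = {
    "source": "app_a",
    "target": "app_b",
    "type": "email_summary",
    "context": {
        "summary": "Customer requests a refund for a laptop.",
        "original_email": "Hello, I need help with my laptop refund.",
    },
}

encoded = json.dumps(package)   # what App A would POST to the MCP server
decoded = json.loads(encoded)   # what App B would receive from /poll
```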

Fast MCP Implementation

This project implements Fast MCP principles:

  • Minimal memory usage
  • Stateless message delivery
  • Simple message queuing
  • Direct point-to-point routing
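The principles above can be sketched as a plain in-memory router: one queue per target, with each message deleted the moment it is delivered, so the server retains no state afterward. This is an illustrative sketch, not the repo's actual router.py:

```python
from collections import defaultdict, deque

class Router:
    """Minimal stateless-delivery router sketch (hypothetical)."""

    def __init__(self) -> None:
        # One FIFO queue per target app (simple message queuing).
        self._queues = defaultdict(deque)

    def send(self, target: str, package: dict) -> None:
        """Queue a package for one specific target (point-to-point)."""
        self._queues[target].append(package)

    def poll(self, target: str):
        """Deliver and delete the oldest pending package, or return None."""
        queue = self._queues[target]
        return queue.popleft() if queue else None
```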

Quick Start

  1. Clone the repository:

    git clone https://github.com/luketych/playground--mcp-model_context_protocol

  2. Install dependencies:

    cd playground--mcp-model_context_protocol
    uv pip install -e .

  3. Follow the documentation: see the Installation and Running the Services sections above for detailed setup and usage.

Repository Details

Owner: luketych
Repo: playground--mcp-model_context_protocol
Language: Python
License: none
Last fetched: 8/10/2025
