langgraph-a2a-mcp-example

n-sviridenko · public MCP server

Demonstrates how to build an A2A Protocol-compatible application using LangGraph and MCP.

Repository Info

Stars: 3
Forks: 1
Watchers: 3
Issues: 1
Language: Python
License: -

About This Server

Demonstrates how to build an A2A Protocol-compatible application using LangGraph and MCP.

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

LangGraph A2A MCP Example

This repository demonstrates how to build an A2A Protocol-compatible application using LangGraph with Model Context Protocol (MCP) capabilities.

Video Overview

Architecture

graph TD
    A2AClient[A2A Client] -->|A2A Protocol| A2AAdapter[LangGraph A2A Adapter]
    A2AAdapter -->|LangGraph Server API| Graph[LangGraph Agent Graph]
    Graph -->|MCP| Tavily[Tavily Search]
    
    subgraph "This Repository"
        Graph
        Tavily
    end
    
    style A2AClient fill:#e1e8ed,stroke:#333,stroke-width:1.5px
    style A2AAdapter fill:#9aadc2,stroke:#333,stroke-width:1.5px
    style Graph fill:#c2dcf2,stroke:#333,stroke-width:1.5px
    style Tavily fill:#d3e0ea,stroke:#333,stroke-width:1.5px

Overview

LangGraph is a library for building stateful, multi-actor applications with LLMs. This example shows how to create a LangGraph application that is compatible with the A2A (Agent-to-Agent) Protocol, enabling it to communicate with any A2A-compatible client.

Key features:

  • A2A Protocol Support: Seamless integration with A2A-compatible clients through the adapter
  • Model Context Protocol (MCP): Support for structured communication between AI systems using LangChain MCP Adapters (see the sketch after this list)
  • Stateful Conversation: Built-in support for persistent state, checkpoints, and multi-step interactions
  • Human-in-the-Loop: Capability for both autonomous operation and human collaboration
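
The snippet below is a minimal, hypothetical sketch of how a LangGraph agent can consume MCP tools (such as Tavily search) through the LangChain MCP Adapters. The server command, model choice, and client API shown here are assumptions and may differ from the actual code in my_agent/; in particular, the MultiServerMCPClient interface has changed between langchain-mcp-adapters releases.

    # Hypothetical sketch only – not the repository's actual agent code.
    import asyncio

    from langchain_anthropic import ChatAnthropic
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent


    async def main() -> None:
        # Expose a Tavily MCP server's tools to the agent (server command is
        # illustrative; it expects TAVILY_API_KEY in the environment).
        client = MultiServerMCPClient(
            {
                "tavily": {
                    "command": "npx",
                    "args": ["-y", "tavily-mcp"],
                    "transport": "stdio",
                }
            }
        )
        tools = await client.get_tools()  # MCP tools surfaced as LangChain tools

        # Prebuilt ReAct-style graph: the model decides when to call the MCP tools.
        agent = create_react_agent(ChatAnthropic(model="claude-3-5-sonnet-latest"), tools)

        result = await agent.ainvoke(
            {"messages": [{"role": "user", "content": "What's the weather like in New York?"}]}
        )
        print(result["messages"][-1].content)


    if __name__ == "__main__":
        asyncio.run(main())

When the project runs under langgraph dev, the graph itself is what gets served; the asyncio entry point above is only for standalone experimentation.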

Setup Requirements

You'll need to set up two components:

  1. This LangGraph application - The actual agent implementation
  2. LangGraph A2A Adapter - Translates between A2A protocol and LangGraph API (GitHub repo)

1. Setting up the LangGraph Application

  1. Clone the repository:

    git clone <repository-url>
    cd langgraph-a2a-mcp-example
    
  2. Create and activate a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  3. Install dependencies:

    pip install -r my_agent/requirements.txt
    
  4. Create a .env file with your API keys:

    cp .env.example .env
    

    Then add your API keys for Anthropic, Tavily, and OpenAI.
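
     For reference, the populated .env usually ends up with entries along these lines. The variable names below are the standard ones used by the Anthropic, Tavily, and OpenAI SDKs and are an assumption here; .env.example lists the exact keys the agent reads:

    # Illustrative – check .env.example for the exact variable names
    ANTHROPIC_API_KEY=sk-ant-...
    TAVILY_API_KEY=tvly-...
    OPENAI_API_KEY=sk-...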

2. Setting up the A2A Adapter

  1. Clone the adapter repository:

    git clone https://github.com/n-sviridenko/langgraph-a2a-adapter.git
    cd langgraph-a2a-adapter
    
  2. Follow the installation instructions in the adapter's README.

  3. Create a .env file with the following configuration:

    # LangGraph Connection
    LANGGRAPH_API_URL=http://localhost:2024
    
    # A2A Server Configuration
    A2A_PUBLIC_BASE_URL=http://localhost:8000
    A2A_PORT=8000
    
    # Agent Card Configuration
    AGENT_NAME="Weather Assistant"
    AGENT_DESCRIPTION="An AI assistant that provides weather information, forecasts, and related climate data."
    AGENT_VERSION=1.0.0
    AGENT_SKILLS='[{"id":"weather_info","name":"Weather Information","description":"Get current weather conditions for any location","examples":["What\'s the weather like in New York?","Is it raining in London right now?"]},{"id":"weather_forecast","name":"Weather Forecast","description":"Get weather forecasts for upcoming days","examples":["What\'s the forecast for Tokyo this weekend?","Will it snow in Chicago next week?"]}]'
    

Running the System

  1. Start the LangGraph application:

    langgraph dev
    
  2. In a separate terminal, start the A2A adapter:

    cd langgraph-a2a-adapter
    python main.py
    
  3. Connect any A2A-compatible client to the adapter at http://localhost:8000. You can use the Google A2A Demo Web App for testing.
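
To verify that the adapter is reachable, you can fetch its A2A agent card. The snippet below is illustrative: it assumes the adapter publishes the card at the A2A-standard /.well-known/agent.json path, which some protocol revisions name differently.

    # Illustrative: fetch the adapter's agent card (path per the A2A spec).
    import json
    import urllib.request

    with urllib.request.urlopen("http://localhost:8000/.well-known/agent.json") as resp:
        card = json.load(resp)

    print(card["name"])                               # e.g. "Weather Assistant"
    print([skill["id"] for skill in card.get("skills", [])])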

A2A Integration

The A2A adapter provides:

  • Agent discovery through standard A2A agent cards
  • Message exchange with assistants
  • Task management
  • Streaming responses
  • Push notifications for task updates
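
As an illustration of the message-exchange flow in the list above, a bare-bones A2A request might look like the sketch below. The method name and payload fields follow the public A2A JSON-RPC schema ("message/send"); older protocol revisions used "tasks/send", so check the adapter's README for the exact shape it expects.

    # Illustrative A2A JSON-RPC call; field names are assumptions based on the
    # public A2A spec and may differ from what this adapter version expects.
    import json
    import urllib.request
    import uuid

    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "message/send",
        "params": {
            "message": {
                "role": "user",
                "messageId": str(uuid.uuid4()),
                "parts": [{"kind": "text", "text": "What's the weather like in New York?"}],
            }
        },
    }

    request = urllib.request.Request(
        "http://localhost:8000",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as resp:
        print(json.dumps(json.load(resp), indent=2))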

Deployment

To deploy this agent to LangGraph Cloud, first fork this repository, then follow the LangGraph Cloud deployment instructions.

For the A2A Adapter, see the deployment instructions in its repository.

Quick Start

  1. Clone the repository:

    git clone https://github.com/n-sviridenko/langgraph-a2a-mcp-example
    cd langgraph-a2a-mcp-example

  2. Install dependencies:

    pip install -r my_agent/requirements.txt

  3. Follow the documentation:

    Check the repository's README.md for detailed installation and usage instructions.

Repository Details

Owner: n-sviridenko
Repo: langgraph-a2a-mcp-example
Language: Python
License: -
Last fetched: 8/10/2025
