
bridge_ai

AI Agent collaboration space

Repository Info

  • Stars: 0
  • Forks: 0
  • Watchers: 0
  • Issues: 0
  • Language: Python
  • License: Apache License 2.0

About This Server

AI Agent collaboration space

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities for richer AI interactions.

Documentation

Bridge AI

AI Agent collaboration space

Overview

Bridge AI provides a proxy server based on the Model Context Protocol (MCP), enabling unified access to multiple FastMCP-compatible servers. It aggregates tools, resources, and prompts from several upstream servers, presenting them through a single API endpoint.

Proxy Server: MultiFastMCP

The proxy server (proxy_server.py) acts as an aggregator for multiple FastMCPProxy instances (each pointing to a real FastMCP server). It exposes a unified API for tools, resources, and prompts, delegating requests to the appropriate upstream server.
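
A minimal sketch of this delegation pattern is shown below. It is not the actual MultiFastMCP code from src/proxy_server.py; the class name, method names, and URL list are placeholders, and it assumes the fastmcp Client with SSE URL inference (FastMCP 2.x). It simply indexes tools across the upstream endpoints and routes each call to the server that owns the tool.

import asyncio
from fastmcp import Client

class MultiFastMCPSketch:
    """Illustrative aggregator: route each tool call to its upstream server."""

    def __init__(self, server_urls: list[str]):
        self.server_urls = server_urls
        self.tool_index: dict[str, str] = {}  # tool name -> upstream URL

    async def refresh(self) -> None:
        # Build a unified tool index across all upstream servers.
        for url in self.server_urls:
            async with Client(url) as client:
                for tool in await client.list_tools():
                    self.tool_index[tool.name] = url

    async def call_tool(self, name: str, arguments: dict):
        # Delegate execution to the upstream server that exposes the tool.
        async with Client(self.tool_index[name]) as client:
            return await client.call_tool(name, arguments)

async def main():
    proxy = MultiFastMCPSketch([
        "http://127.0.0.1:8001/sse",
        "http://127.0.0.1:8002/sse",
        "http://127.0.0.1:8003/sse",
    ])
    await proxy.refresh()
    print("Aggregated tools:", sorted(proxy.tool_index))

if __name__ == "__main__":
    asyncio.run(main())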

Features

  • Aggregates multiple FastMCP-compatible servers
  • Unified view of all tools, resources, and prompts
  • Delegates execution and queries to the correct upstream server
  • Supports MCP protocol (SSE endpoints)

Usage

Requirements

  • Python 3.8+
  • Dependencies listed in pyproject.toml (notably: fastmcp, mcp, uvicorn)

Running the Proxy Server

By default, the proxy server connects to three upstream FastMCP servers (adjustable in code):

  • http://127.0.0.1:8003/sse
  • http://127.0.0.1:8001/sse
  • http://127.0.0.1:8002/sse

To start the proxy server on port 9000:

python src/proxy_server.py

The server will attempt to connect to the upstream servers. If none are available, it will exit with an error.
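
The availability check could look roughly like the sketch below; this is an assumption about the behavior described above, not the code actually used in src/proxy_server.py.

import asyncio
import sys
from fastmcp import Client

SERVER_URLS = [
    "http://127.0.0.1:8003/sse",
    "http://127.0.0.1:8001/sse",
    "http://127.0.0.1:8002/sse",
]

async def reachable(url: str) -> bool:
    # Treat an upstream as available if a session opens and tools can be listed.
    try:
        async with Client(url) as client:
            await client.list_tools()
        return True
    except Exception:
        return False

async def check_upstreams() -> None:
    results = await asyncio.gather(*(reachable(url) for url in SERVER_URLS))
    if not any(results):
        sys.exit("No upstream FastMCP servers are reachable; aborting.")

if __name__ == "__main__":
    asyncio.run(check_upstreams())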

Configuration

To change the upstream servers, modify the server_urls list in create_proxy_server() inside src/proxy_server.py.
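
The change amounts to editing the URLs in that list; the function shown here is only an illustrative stub, since the rest of create_proxy_server() is not reproduced in this documentation.

# src/proxy_server.py (illustrative stub)
def create_proxy_server():
    server_urls = [
        "http://127.0.0.1:8003/sse",
        "http://127.0.0.1:8001/sse",
        "http://127.0.0.1:8002/sse",
    ]
    # ... build and return the aggregating proxy from server_urls ...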

Endpoints

  • MCP-compatible SSE endpoint (default: http://0.0.0.0:9000/sse)
  • All MCP protocol endpoints exposed by FastMCP
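
Any SSE-capable MCP client can then point at that endpoint. A minimal sketch using the fastmcp Client, assuming the proxy is running on the default port above:

import asyncio
from fastmcp import Client

async def main():
    # Connect to the proxy's SSE endpoint (default port 9000).
    async with Client("http://127.0.0.1:9000/sse") as client:
        tools = await client.list_tools()
        print("Tools exposed through the proxy:", [tool.name for tool in tools])

asyncio.run(main())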

Example

Start three FastMCP servers on ports 8001, 8002, and 8003, then run the proxy as above. All tools, resources, and prompts from the upstream servers will be accessible via the proxy’s API.
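
For local testing, an upstream server can be as small as the sketch below; the add tool is a placeholder and not part of bridge_ai, and the SSE arguments assume FastMCP 2.x.

# upstream_8001.py - hypothetical upstream server for trying out the proxy.
from fastmcp import FastMCP

mcp = FastMCP("upstream-8001")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers (placeholder tool)."""
    return a + b

if __name__ == "__main__":
    # Serves an SSE endpoint at http://127.0.0.1:8001/sse
    mcp.run(transport="sse", host="127.0.0.1", port=8001)

Running two more copies on ports 8002 and 8003 and then starting the proxy should make all of their tools visible through the proxy endpoint.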

Quick Start

1. Clone the repository

git clone https://github.com/zemskymax/bridge_ai

2. Install dependencies

cd bridge_ai
pip install .

3. Follow the documentation

Check the repository's README.md file for specific installation and usage instructions.

Repository Details

  • Owner: zemskymax
  • Repo: bridge_ai
  • Language: Python
  • License: Apache License 2.0
  • Last fetched: 8/10/2025
