peterableda/mcp-examples (public)

Examples of using the Model Context Protocol (MCP) across multiple frameworks, backed by open-source LLMs and the Cloudera AI Inference service.

Repository Info

Stars: 1
Forks: 0
Watchers: 1
Issues: 0
Language: Python
License: Apache License 2.0

About This Server

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

MCP Examples

This repository contains minimal, end-to-end examples for utilizing Model Context Protocol (MCP) in a variety of frameworks—LangChain, CrewAI, OpenAI Agent SDK, LlamaIndex, Microsoft AutoGen—backed by open-source LLMs hosted on Cloudera AI Inference service.

MCP is everywhere these days. This project is a hands‑on way to test MCP and assess its adoption across the AI ecosystem.

Components

MCP Server

The examples use a standalone Cloudera Iceberg MCP server to provide data as context for LLMs. This server connects to sample Apache Iceberg tables through Impala, runs as an independent process, and exposes a Server‑Sent Events (SSE) endpoint.

See mcp-server-sse/ for the standalone SSE‑based Iceberg server.
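As a quick smoke test, you can connect to the server's SSE endpoint with the official mcp Python SDK. This is a minimal sketch, assuming the server listens at http://localhost:8000/sse; substitute whatever host and port your mcp-server-sse instance actually uses.

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Placeholder URL; point this at your running Iceberg MCP server.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())
```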

Model

The examples connect to Cloudera AI Inference service, Cloudera’s model‑serving platform that offers OpenAI‑compatible endpoints with end‑to‑end privacy for both public‑cloud and on‑premises deployments. Built on KServe, it can serve any Hugging Face open‑source model as well as NVIDIA‑optimized NIM containers for maximum performance.

In these examples, we call the meta/llama-3.1-8b-instruct NVIDIA NIM (hosted by Cloudera AI Inference service) to drive our LLM workloads.
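Because the endpoints are OpenAI-compatible, the standard openai Python client can call the hosted model directly. A minimal sketch, with placeholder values for the endpoint URL and token:

```python
from openai import OpenAI

# Placeholders; take the real base URL and token from your
# Cloudera AI Inference deployment.
client = OpenAI(
    base_url="https://<inference-endpoint>/v1",
    api_key="<your-token>",
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```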

Frameworks

Each framework directory is a standalone UV project and includes its own README with full instructions. The example scripts in each directory share a common flow, sketched in code after the table below. Each script will:

  1. Load your .env settings (for connection and auth details for the Cloudera AI Inference model and Impala endpoints).
  2. Connect to the MCP server over SSE.
  3. Initialize the OpenAI-compatible client to point at Cloudera AI Inference.
  4. Spin up an Agent that listens for context updates.
  5. Send a sample query.
  6. Route context via SSE and LLM calls through Cloudera AI.
  7. Output the Agent’s answer to your console.

Framework                  Location
CrewAI                     framework-crewai-example
OpenAI Agent SDK           framework-openai-agent-sdk-example
LangChain / LangGraph      framework-langchain-example
LlamaIndex                 framework-llamaindex-example
Microsoft AutoGen (WIP)    framework-autogen-example
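
The wiring in steps 1-3 looks roughly the same in every framework. Below is a minimal sketch using the mcp, openai, and python-dotenv packages; the environment-variable names are assumptions for illustration, and the real settings come from each framework's .env template. Steps 4-7 (the agent itself) are framework-specific, so the sketch stands in with a single direct model call.

```python
import asyncio
import os

from dotenv import load_dotenv
from mcp import ClientSession
from mcp.client.sse import sse_client
from openai import OpenAI

load_dotenv()  # step 1: read connection and auth details from .env

async def main():
    # Step 2: connect to the Iceberg MCP server over SSE.
    # MCP_SSE_URL is an assumed variable name.
    async with sse_client(os.environ["MCP_SSE_URL"]) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("MCP tools:", [tool.name for tool in tools.tools])

    # Step 3: OpenAI-compatible client pointed at Cloudera AI Inference.
    # These variable names are also assumptions.
    llm = OpenAI(
        base_url=os.environ["CAI_INFERENCE_BASE_URL"],
        api_key=os.environ["CAI_API_KEY"],
    )

    # Steps 4-7 are where each framework builds its agent; as a stand-in,
    # send one sample query straight to the model.
    response = llm.chat.completions.create(
        model="meta/llama-3.1-8b-instruct",
        messages=[{"role": "user", "content": "What Iceberg tables are available?"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```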

Quick Start

1. Clone the repository

   git clone https://github.com/peterableda/mcp-examples

2. Install dependencies

   The repository is Python-based; each framework directory is a standalone UV project, so dependencies are installed per example, for instance:

   cd mcp-examples/framework-langchain-example
   uv sync

3. Follow the documentation

   Check the repository's README.md and each framework directory's README for specific installation and usage instructions.

Repository Details

Owner: peterableda
Repo: mcp-examples
Language: Python
License: Apache License 2.0
Last fetched: 8/10/2025

Recommended MCP Servers

💬 Discord MCP: Enable AI assistants to seamlessly interact with Discord servers, channels, and messages. (integrations, discord, chat)

🔗 Knit MCP: Connect AI agents to 200+ SaaS applications and automate workflows. (integrations, automation, saas)

🕷️ Apify MCP Server: Deploy and interact with Apify actors for web scraping and data extraction. (apify, crawler, data)

🌐 BrowserStack MCP: BrowserStack MCP Server for automated testing across multiple browsers. (testing, qa, browsers)

Zapier MCP: A Zapier server that provides automation capabilities for various apps. (zapier, automation)