
# MCP Examples
This repository contains minimal, end-to-end examples for utilizing Model Context Protocol (MCP) in a variety of frameworks—LangChain, CrewAI, OpenAI Agent SDK, LlamaIndex, Microsoft AutoGen—backed by open-source LLMs hosted on Cloudera AI Inference service.
MCP is everywhere these days. This project is a hands‑on way to test MCP and assess its adoption across the AI ecosystem.
## Components

### MCP Server
The examples use a standalone Cloudera Iceberg MCP server to provide data as context for LLMs. This server connects to sample Apache Iceberg tables through Impala, runs as an independent process, and exposes a Server‑Sent Events (SSE) endpoint.
See `mcp-server-sse/` for the standalone SSE‑based Iceberg server.
### Model
The examples connect to Cloudera AI Inference service, Cloudera’s model‑serving platform that offers OpenAI‑compatible endpoints with end‑to‑end privacy for both public‑cloud and on‑premises deployments. Built on KServe, it can serve any Hugging Face open‑source model as well as NVIDIA‑optimized NIM containers for maximum performance.
In these examples, we call the meta/llama-3.1-8b-instruct NVIDIA NIM (hosted by Cloudera AI Inference Service) to drive our LLM workloads.
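Because the endpoints are OpenAI‑compatible, any OpenAI client works once its base URL points at Cloudera AI Inference. A minimal stdlib‑only sketch of the chat‑completions request the examples end up sending (the base URL placeholder and environment variable names here are illustrative; the real values come from your `.env` file):

```python
import json
import os
import urllib.request

# Illustrative defaults; real values come from your .env settings.
BASE_URL = os.getenv("OPENAI_BASE_URL", "https://<your-cloudera-ai-inference-endpoint>/v1")
API_KEY = os.getenv("OPENAI_API_KEY", "replace-me")
MODEL = "meta/llama-3.1-8b-instruct"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Against a live endpoint you would send it with:
#   with urllib.request.urlopen(build_chat_request("Hello")) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

The framework examples do the same thing through their own OpenAI client integrations rather than raw HTTP.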
### Frameworks
Each framework directory is a standalone `uv` project and includes its own README with full instructions. The example scripts in each directory will:
- Load your .env settings (for connection and auth details for the Cloudera AI Inference model and Impala endpoints).
- Connect to the MCP server over SSE.
- Initialize an OpenAI‑compatible client pointed at Cloudera AI Inference.
- Spin up an Agent that listens for context updates.
- Send a sample query.
- Route context updates over SSE and LLM calls through Cloudera AI Inference.
- Output the Agent’s answer to your console.
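The first step above is plain dotenv loading; as an illustration of what it amounts to, here is a minimal stdlib‑only parser (the example projects typically rely on a dotenv helper such as python‑dotenv instead, which additionally handles quoting and multi‑line values):

```python
import os

def load_env(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=VALUE lines from a .env file into os.environ.

    Skips blank lines and # comments; values are stripped of
    surrounding whitespace only (no quote handling).
    """
    settings: dict[str, str] = {}
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                settings[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # fall back to whatever is already in the environment
    os.environ.update(settings)
    return settings
```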
| Framework | Location |
|---|---|
| CrewAI | `framework-crewai-example` |
| OpenAI Agent SDK | `framework-openai-agent-sdk-example` |
| LangChain / LangGraph | `framework-langchain-example` |
| LlamaIndex | `framework-llamaindex-example` |
| Microsoft AutoGen (WIP) | `framework-autogen-example` |
## Quick Start
1. Clone the repository:

   ```shell
   git clone https://github.com/peterableda/mcp-examples
   cd mcp-examples
   ```

2. Install dependencies. Each framework directory is a standalone `uv` project, for example:

   ```shell
   cd framework-langchain-example
   uv sync
   ```

3. Follow the documentation: check each framework directory's README for specific installation and usage instructions.