
# MCP + Ollama Local Tool Calling Example
This project demonstrates how a local AI agent can understand user queries and automatically call Python functions using:
- Model Context Protocol (MCP)
- Ollama for running a local LLM (e.g., Llama3)
- Python MCP Client and Server
## 🔗 Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant MCP_Client
    participant Ollama_LLM
    participant MCP_Server
    User->>MCP_Client: 1) User types: "What is 5 + 8?"
    MCP_Client->>Ollama_LLM: 2) Send available tools + user query
    Ollama_LLM->>Ollama_LLM: 3) Understand query & tool descriptions
    Ollama_LLM->>Ollama_LLM: 4) Select tool: add(a=5, b=8)
    Ollama_LLM->>MCP_Client: 5) Return tool_call
    MCP_Client->>MCP_Server: 6) Execute add(a=5, b=8)
    MCP_Server-->>MCP_Client: 7) Return result: 13
    MCP_Client-->>User: 8) Show final answer: 13
```
## 📚 Project Structure

```text
.
├── math_server.py    # MCP Server exposing add() and multiply() tools
├── ollama_client.py  # MCP Client interacting with Ollama
└── README.md         # Project documentation
```
## 🛠️ Setup Instructions

### 1. Clone the repository and install requirements

```bash
git clone https://github.com/rajeevchandra/mcp-client-server-example
cd mcp-client-server-example
pip install "mcp[cli] @ git+https://github.com/awslabs/mcp.git" openai==0.28 httpx
```
Make sure you have Ollama installed and running.
### 2. Pull or run an LLM model

```bash
ollama run llama3
```
(Ensure the model you run supports tool calling.)
### 3. Run the MCP Server

```bash
python math_server.py
```

The server exposes two simple tools:

- `add(a: int, b: int) -> int`
- `multiply(a: int, b: int) -> int`
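For reference, a server along these lines can be written with the `FastMCP` helper from the MCP Python SDK. This is a minimal sketch of one plausible implementation, not necessarily the repository's exact code:

```python
# math_server.py — minimal sketch using the MCP Python SDK's FastMCP helper.
# Assumed implementation; the repository's actual code may differ.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

if __name__ == "__main__":
    # Serve over stdio so an MCP client can spawn this script as a subprocess.
    mcp.run(transport="stdio")
```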
### 4. Run the MCP Client

```bash
python ollama_client.py math_server.py
```
### 5. Interact!

Example queries:

```text
Query: What is 5 + 8?
Response: 13

Query: Multiply 7 and 9
Response: 63
```
The MCP client sends the query and available tools to Ollama. The LLM internally decides which tool to use based on the tool descriptions and user intent.
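Sketched below is what that client loop could look like, combining the MCP Python SDK's stdio client with Ollama's `/api/chat` endpoint (which accepts a `tools` list and returns `tool_calls`). The structure, model name, and helper names here are illustrative assumptions, not the repository's exact code:

```python
# ollama_client.py — illustrative sketch; the actual repository code may differ.
import asyncio
import sys

import httpx
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main(server_script: str, query: str) -> None:
    # 1) Spawn the MCP server as a subprocess and open a session over stdio.
    params = StdioServerParameters(command="python", args=[server_script])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 2) List the server's tools and convert them to Ollama's tool schema.
            listed = await session.list_tools()
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description,
                        "parameters": t.inputSchema,
                    },
                }
                for t in listed.tools
            ]

            # 3) Send the user query plus tool descriptions to the local LLM.
            async with httpx.AsyncClient() as http:
                resp = await http.post(
                    "http://localhost:11434/api/chat",
                    json={
                        "model": "llama3",
                        "messages": [{"role": "user", "content": query}],
                        "tools": tools,
                        "stream": False,
                    },
                    timeout=60.0,
                )
                message = resp.json()["message"]

            # 4) If the LLM chose a tool, execute it via the MCP server.
            for call in message.get("tool_calls", []):
                fn = call["function"]
                result = await session.call_tool(fn["name"], fn["arguments"])
                print(result.content[0].text)

if __name__ == "__main__":
    asyncio.run(main(sys.argv[1], "What is 5 + 8?"))
```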
## 🚀 How It Works

- MCP Client lists available tools.
- Sends tools + user query to Ollama LLM.
- LLM reasons about the best matching tool.
- LLM generates a `tool_call` (see the example payload after this list).
- MCP Client invokes the function via the MCP Server.
- Final result is returned and displayed.
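For illustration, a `tool_call` in Ollama's chat response has roughly this shape (abridged; exact fields can vary by version):

```json
{
  "message": {
    "role": "assistant",
    "tool_calls": [
      {
        "function": {
          "name": "add",
          "arguments": { "a": 5, "b": 8 }
        }
      }
    ]
  }
}
```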
✅ No manual hardcoding! ✅ Everything runs locally! ✅ Fully autonomous!
## 📢 Why This Matters
This pattern enables building smart local AI agents that:
- Understand user intent
- Dynamically select the correct actions
- Operate fully offline and locally
It opens doors for:
- Autonomous developer tools
- Local intelligent assistants
- Secure AI workflows
## 🏷️ Hashtags for Sharing
#MCP #ModelContextProtocol #Ollama #LocalLLM #FunctionCalling #Python #AI #DeveloperTools #AIEngineering #AutonomousAgents
## 🙌 Credits
- Model Context Protocol
- Ollama
"Smarter AI agents start with understanding how they think!"
Next Steps: Add Streamlit UI or Dockerize this project 🚀