
🧠 LLM Tool-Calling Assistant with MCP Integration
Connect your local LLM to real-world tools, knowledge bases, and APIs via MCP.
This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the MCP protocol. The assistant automatically detects and calls these tools to help answer user queries.
📦 Features
- 🔧 Tool execution through MCP server
- 🧠 Local LLM integration via HTTP or OpenAI SDK
- 📚 Knowledge base support (`data.json`)
- ⚡ Supports `stdio` and `sse` transports
🗂 Project Files
| File | Description |
|---|---|
| server.py | Registers tools and starts the MCP server |
| client-http.py | Uses aiohttp to communicate with the local LLM |
| client-openai.py | Uses the OpenAI-compatible SDK for LLM and tool-call logic |
| client-stdio.py | MCP client using stdio transport |
| client-sse.py | MCP client using SSE transport |
| data.json | Q&A knowledge base |
📥 Installation
Requirements
Python 3.8+
Install dependencies:
pip install -r requirements.txt
requirements.txt:
```
aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0
```
🚀 Getting Started
1. Run the MCP server
python server.py
This launches your tool server with functions like add, multiply, and get_knowledge_base.
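For orientation, here is a minimal sketch of the kind of tool registration server.py performs, using the FastMCP helper from the mcp package. The server name and exact function signatures are assumptions, not taken from the repository.

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical server name; server.py may use a different one.
mcp = FastMCP("Demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # Use transport="sse" instead to serve SSE clients (see Option D below).
    mcp.run(transport="stdio")
```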
2. Start a client
Option A: HTTP client (local LLM via raw API)
python client-http.py
Option B: OpenAI SDK client
python client-openai.py
Option C: stdio transport
python client-stdio.py
Option D: SSE transport
Make sure server.py sets:
transport = "sse"
Then run:
python client-sse.py
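For reference, below is a minimal sketch of an MCP stdio client along the lines of client-stdio.py, assuming the standard mcp Python SDK; the repository's actual client presumably adds the LLM call and tool-dispatch logic on top of this.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch server.py as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            result = await session.call_tool("add", {"a": 8, "b": 3})
            print("add(8, 3) ->", result.content)

asyncio.run(main())
```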
💬 Example Prompts
Math Tool Call
What is 8 times 3?
Response:
Eight times three is 24.
Knowledge Base Question
What are the healthcare benefits available to employees in Singapore?
Response will include the relevant answer from data.json.
📁 Example: data.json
```json
[
  {
    "question": "What is Singapore's public holiday schedule?",
    "answer": "Singapore observes several public holidays..."
  },
  {
    "question": "How do I apply for permanent residency in Singapore?",
    "answer": "Submit an online application via the ICA website..."
  }
]
```
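A get_knowledge_base tool can then expose these entries to the model. The sketch below continues the FastMCP server example above; the exact text formatting is an assumption.

```python
import json
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Demo")  # same hypothetical server instance as in the earlier sketch

@mcp.tool()
def get_knowledge_base() -> str:
    """Return the full Q&A knowledge base as plain text for the LLM to quote from."""
    with open("data.json", "r", encoding="utf-8") as f:
        entries = json.load(f)
    return "\n\n".join(
        f"Q{i}: {e['question']}\nA{i}: {e['answer']}"
        for i, e in enumerate(entries, 1)
    )
```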
🔧 Configuration
Inside client-http.py or client-openai.py, update the following:
```python
LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"
```
Make sure your LLM is serving OpenAI-compatible API endpoints.
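As a rough illustration, client-openai.py likely points the OpenAI SDK at this endpoint and forwards any tool calls the model emits to the MCP session. The endpoint value, tool schema, and flow below are assumptions for illustration only.

```python
from openai import OpenAI

LOCAL_LLM_URL = "http://localhost:8000/v1"  # example endpoint, replace with yours
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"

client = OpenAI(base_url=LOCAL_LLM_URL, api_key=TOKEN)

# Advertise an MCP tool to the model using the OpenAI tool-calling schema.
response = client.chat.completions.create(
    model=LOCAL_LLM_MODEL,
    messages=[{"role": "user", "content": "What is 8 times 3?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "multiply",
            "description": "Multiply two numbers.",
            "parameters": {
                "type": "object",
                "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
                "required": ["a", "b"],
            },
        },
    }],
)

# Any tool call returned here would be executed through the MCP client session
# (see the stdio sketch above) and its result sent back to the model.
print(response.choices[0].message.tool_calls)
```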
🧹 Cleanup
Clients handle tool calls and responses automatically. You can stop the server or client using Ctrl+C.
🪪 License
MIT License. See LICENSE file.
Quick Start
Clone the repository:
git clone https://github.com/o6-webwork/mcp-template
cd mcp-template
Install dependencies:
pip install -r requirements.txt
Then follow the usage instructions above.