
mcp_sample
A Python-based MCP sample project covering LLM server startup, model download, and UI configuration.
Repository Info
About This Server
Model Context Protocol (MCP) - this server can be integrated with AI applications to provide additional context and tool capabilities.
Documentation
README
start
Start the LLM server
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
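To confirm the container came up, the Ollama HTTP API can be queried on the mapped port. This is an optional check, not part of the original steps, and assumes the default port mapping from the command above; the endpoint lists the models currently available locally.
curl http://localhost:11434/api/tags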
Download the model
docker exec -it ollama ollama run gemma3:1b
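Once the pull completes, the model can also be exercised over the API instead of the interactive prompt; a minimal sketch, assuming the container and port from the previous step:
curl http://localhost:11434/api/generate -d '{"model": "gemma3:1b", "prompt": "Hello", "stream": false}'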
Install uv
pip install uv
uv sync
Start MCP (mcpo)
uvx mcpo --port 8080 --host 0.0.0.0 --config mcp_config.json
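The actual mcp_config.json lives in the repository; mcpo's --config flag expects a Claude Desktop-style mcpServers layout, so a minimal sketch of that format looks like the following (the two server entries are illustrative placeholders, not the repository's real configuration):
{
  "mcpServers": {
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=Asia/Tokyo"]
    },
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    }
  }
}
With a config file, mcpo should expose each named server on its own route under port 8080 (e.g. /time, /memory), each with auto-generated OpenAPI docs at /<name>/docs.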
Start the UI
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
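Open WebUI is then available at http://localhost:3000. The mcpo endpoints started above can be registered in Open WebUI as OpenAPI tool servers; because the UI runs inside Docker with --add-host=host.docker.internal:host-gateway, the host's mcpo port is reachable from the container at a URL of the form shown below (the server name is a placeholder):
http://host.docker.internal:8080/<server-name>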
Quick Start
Clone the repository
git clone https://github.com/if001/mcp_sample
cd mcp_sample
Install dependencies
pip install uv
uv sync
Follow the documentation
Check the repository's README.md file for specific installation and usage instructions.
Recommended MCP Servers
Discord MCP
Enable AI assistants to seamlessly interact with Discord servers, channels, and messages.
Knit MCP
Connect AI agents to 200+ SaaS applications and automate workflows.
Apify MCP Server
Deploy and interact with Apify actors for web scraping and data extraction.
BrowserStack MCP
BrowserStack MCP Server for automated testing across multiple browsers.
Zapier MCP
A Zapier server that provides automation capabilities for various apps.