
OpenAI Assistant MCP Server
MCP server for OpenAI Assistant API. Provides tools for managing assistants, threads, messages, and runs.
Installation
# Clone the repository
git clone https://github.com/xrouter-chat/openai-assistant-mcp
cd openai-assistant-mcp
# Install dependencies
uv pip install -e .
Configuration
Set your OpenAI API key:
export OPENAI_API_KEY=your_api_key_here
Or create a .env file:
OPENAI_API_KEY=your_api_key_here
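If you want to double-check that the key is actually visible to the process before starting the server, a small check like the sketch below works; it assumes python-dotenv is installed, which is only relevant if you rely on the .env file rather than an exported variable.

import os

from dotenv import load_dotenv  # assumption: python-dotenv is available

# Pick up OPENAI_API_KEY from a .env file in the current directory, if present.
load_dotenv()

key = os.environ.get("OPENAI_API_KEY")
if not key:
    raise SystemExit("OPENAI_API_KEY is not set; export it or add it to .env")
print(f"Found API key ending in ...{key[-4:]}")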
Running the Server
Option 1: Direct Python execution
uv run mcp run run_server.py
Option 2: MCP Dev Mode (for development)
uv run mcp dev run_server.py
Option 3: MCPO with FastAPI UI (recommended)
Due to a bug in the official mcpo release, use the fixed fork:
uvx --from git+https://github.com/bmen25124/mcpo.git@fix_schema_defs_not_found mcpo --port 8602 -- uv run mcp run run_server.py
After starting, access the FastAPI UI at: http://localhost:8602/docs
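With mcpo running, the tools can also be called over plain HTTP. The sketch below assumes mcpo's usual convention of exposing each MCP tool as a POST endpoint named after the tool; check the generated docs at http://localhost:8602/docs for the exact paths and request schemas.

import requests  # assumption: the requests library is installed

# mcpo typically maps each MCP tool to POST /<tool_name> with a JSON body
# of the tool's arguments; verify against the OpenAPI docs at /docs.
resp = requests.post(
    "http://localhost:8602/list_assistants",
    json={},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())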
Available Tools
Assistant Management
create_assistant - Create a new assistant
get_assistant - Retrieve assistant by ID
list_assistants - List all assistants
modify_assistant - Update assistant configuration
delete_assistant - Delete an assistant
Thread Management
create_thread - Create a conversation thread
get_thread - Retrieve thread by ID
modify_thread - Update thread metadata
delete_thread - Delete a thread
Message Management
create_message - Add message to thread
get_message - Retrieve message by ID
list_messages - List thread messages
modify_message - Update message metadata
delete_message - Delete a message
Run Management
create_run - Start assistant execution
create_thread_and_run - Create thread and run in one call
list_runs - List thread runs
get_run - Retrieve run by ID
modify_run - Update run metadata
submit_tool_outputs - Submit tool call results
cancel_run - Cancel active run
Run Steps
list_run_steps - List steps for a run
get_run_step - Retrieve specific step
Example Usage
# Create an assistant
assistant = await create_assistant({
    "model": "gpt-4-turbo-preview",
    "name": "Code Helper",
    "instructions": "You are a helpful coding assistant."
})

# Create a thread
thread = await create_thread()

# Add a message
message = await create_message({
    "thread_id": thread["id"],
    "role": "user",
    "content": "Help me write a Python function"
})

# Run the assistant
run = await create_run({
    "thread_id": thread["id"],
    "assistant_id": assistant["id"]
})
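A run executes asynchronously, so in practice you would poll it with get_run and then read the reply with list_messages. The continuation below is a sketch of that pattern; the parameter names and the "status" values are assumed to mirror the OpenAI Assistants API, so check this server's tool schemas for the exact shapes.

import asyncio

# Poll until the run leaves the queued/in_progress states.
while run["status"] in ("queued", "in_progress"):
    await asyncio.sleep(1)
    run = await get_run({
        "thread_id": thread["id"],
        "run_id": run["id"]
    })

if run["status"] == "completed":
    # Fetch the assistant's reply from the thread.
    messages = await list_messages({"thread_id": thread["id"]})
    print(messages)
elif run["status"] == "requires_action":
    # The assistant requested tool calls; answer with submit_tool_outputs.
    ...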
Project Structure
├── src/
│   ├── server.py          # MCP server entry point
│   ├── config/            # Configuration
│   └── tools/             # Tool implementations
│       ├── assistant/     # Assistant tools
│       ├── threads/       # Thread tools
│       ├── messages/      # Message tools
│       ├── runs/          # Run tools
│       └── run_steps/     # Run step tools
├── tests/                 # Test suite
├── docs/                  # Documentation
└── pyproject.toml         # Project configuration
Troubleshooting
MCPO Schema Error
If you encounter TypeError: argument of type 'NoneType' is not iterable, you're using the broken official mcpo release. Use the fixed fork command shown above.
API Key Issues
Ensure your OpenAI API key has access to the Assistants API beta.
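A quick way to verify access is to list assistants directly with the official openai Python client (1.x); if the key cannot reach the Assistants API, this call fails immediately with an authorization or permission error.

from openai import OpenAI  # assumption: the openai 1.x client is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment
# Listing a single assistant is a cheap call that fails fast if the key
# does not have Assistants API access.
print(client.beta.assistants.list(limit=1))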
License
MIT