
BROKKOLY MCP Server Debugger
A simplified chat interface for interacting with Python MCP (Model Context Protocol) servers in-process.
Overview
BROKKOLY is a simple chat interface designed specifically for Python MCP servers. It allows you to:
- Run MCP servers in-process using threads
- Simulate stdin/stdout communication using memory streams
- Chat with an LLM agent (powered by LangChain and OpenAI) that can interact with MCP servers
Features
- In-process execution: Run MCP servers in the same process as the chat interface
- Memory streams: Simulate stdin/stdout communication using memory streams (see the sketch after this list)
- LLM agent: Chat with an LLM agent that can interact with MCP servers
- Automatic server management: Automatically start all MCP servers at initialization
- Message handling: Send and receive messages to/from MCP servers through the LLM
- Configuration: Configure MCP servers using a JSON configuration file
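Below is a minimal sketch, not BROKKOLY's actual code, of how AnyIO memory object streams (credited in the Acknowledgements) can stand in for a subprocess's stdin/stdout pipes inside a single process. The fake_mcp_server coroutine and the message strings are illustrative:

import anyio

async def fake_mcp_server(stdin, stdout):
    # "stdin" is a receive stream and "stdout" a send stream, mirroring
    # the pipes a real subprocess would own.
    async with stdin, stdout:
        async for request in stdin:
            # Send back a trivial response for each request line.
            await stdout.send('{"result": "ok"}')

async def main():
    # Each stream pair replaces one OS pipe: client -> server, server -> client.
    c2s_send, c2s_recv = anyio.create_memory_object_stream(10)
    s2c_send, s2c_recv = anyio.create_memory_object_stream(10)
    async with anyio.create_task_group() as tg:
        tg.start_soon(fake_mcp_server, c2s_recv, s2c_send)
        async with c2s_send, s2c_recv:
            await c2s_send.send('{"method": "ping"}')
            print(await s2c_recv.receive())

anyio.run(main)

When the client closes its send stream, the server's iteration ends cleanly, which is how such a setup can shut servers down without killing a process.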
Installation
Prerequisites
- Python 3.9 or higher
- OpenAI API key (for the LLM agent)
Installation Steps
- Clone the repository:
git clone https://github.com/yourusername/brokkoly.git
cd brokkoly
- Install the package:
pip install -e .
- Create a .env file with your OpenAI API key:
echo "OPENAI_API_KEY=your-api-key" > .env
Usage
Configuration
Create an mcp.json file with your MCP server configurations:
{
  "mcpServers": {
    "example": {
      "command": "python",
      "args": [
        "/path/to/mcp_server.py",
        "--mode",
        "stdio"
      ],
      "env": {
        "EXAMPLE_VAR": "example_value"
      }
    }
  }
}
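As a hypothetical sketch of how this file might be modeled and loaded with Pydantic (credited in the Acknowledgements for the configuration models); the class names MCPConfig and MCPServerConfig are illustrative, not BROKKOLY's actual API:

import json
from pydantic import BaseModel, Field

class MCPServerConfig(BaseModel):
    command: str                                   # executable, e.g. "python"
    args: list[str] = Field(default_factory=list)  # command-line arguments
    env: dict[str, str] = Field(default_factory=dict)

class MCPConfig(BaseModel):
    mcpServers: dict[str, MCPServerConfig]  # keyed by server name

with open("mcp.json") as f:
    config = MCPConfig.model_validate(json.load(f))

for name, server in config.mcpServers.items():
    print(name, server.command, server.args)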
Running the Debugger
Run the debugger:
python main.py
Or, if you installed the package:
brokkoly
Command-line Arguments
- --config: Path to the MCP configuration file
- --log-level: Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
- --log-file: Path to the log file
- --env-file: Path to the .env file
- --verbose: Enable verbose output
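For example, an illustrative invocation combining the flags above:
python main.py --config mcp.json --log-level DEBUG --verbose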
Chat Interface
The chat interface is a simple text-based loop:
- Type a message to chat with the LLM agent
- The agent can call MCP server tools on your behalf as it answers
- Type 'exit' or 'quit' to exit the application
A sketch of such a loop follows.
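A minimal sketch of the read-eval loop such an interface typically uses; agent.run is a hypothetical method standing in for whatever the LLM agent actually exposes:

def chat_loop(agent):
    while True:
        user_input = input("you> ").strip()
        if user_input.lower() in ("exit", "quit"):
            break
        # The agent answers directly or calls MCP tools as needed.
        print("agent>", agent.run(user_input))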
Development
Project Structure
- brokkoly/: Main package directory
  - config/: Configuration module
  - memory_streams/: Memory streams module
  - server_manager/: Server manager module
  - llm_agent/: LLM agent module
  - utils/: Utility functions
- tests/: Test directory
- main.py: Entry point
- setup.py: Setup script
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgements
- LangChain for the LLM agent
- OpenAI for the LLM API
- Pydantic for the configuration models
- AnyIO for the memory streams