
orderli_recursive_mcp
MCP + Claude Recursive Calls Package
MCP Agent Package
A Python package for interacting with LLMs via the MCP protocol.
Directory Structure
```
mcp_agent_package/
├── orderli_recursive_mcp/   # Main package directory
│   ├── __init__.py          # Package exports
│   ├── client.py            # Main client functionality
│   ├── config.py            # Configuration settings
│   ├── utils.py             # Utility functions
│   └── README.md            # Package documentation
├── main.py                  # Example script to test the package
├── check_server.py          # Utility to check server connection
├── setup.py                 # Package installation config
└── .gitignore               # Git ignore file
```
Installation
From within this directory, run:

```bash
pip install -e .
```
Usage
Basic Import and Usage
```python
import asyncio

from orderli_recursive_mcp import call_llm


# Define a response handler to process streamed responses
async def response_handler(text_chunk: str, event_type: str) -> None:
    """Handle different types of response events."""
    if event_type == "text":
        # Main LLM response text
        print(text_chunk, end="")
    elif event_type == "tool_call":
        # Tool being called
        print(f"\n[TOOL CALL] {text_chunk}")
    elif event_type == "tool_result":
        # Result from tool execution
        print(f"[RESULT] {text_chunk}")
    elif event_type == "info":
        # Informational messages
        print(f"\n[INFO] {text_chunk}")
    elif event_type in ("error", "warning"):
        # Error and warning messages
        print(f"\n[{event_type.upper()}] {text_chunk}")


async def main():
    # User prompt/question
    prompt = "What is the meaning of life?"

    # Call the LLM with the prompt
    session = await call_llm(
        prompt=prompt,
        response_callback=response_handler,
        # Optional parameters:
        # server_url="http://localhost:8001/sse",  # Custom MCP server URL
        # api_key="your_anthropic_api_key",        # Custom API key
        # disable_console_output=True,             # Disable internal console output
        # debug=False,                             # Debug the MCP server connection
    )

    # You can continue the conversation with the same session:
    # await session.chat("Tell me more about that.")


if __name__ == "__main__":
    asyncio.run(main())
```
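The "recursive" in the package name refers to the tool-use loop: each time the model requests a tool, the result is fed back for another model turn, up to the `MCP_MAX_RECURSIONS` cap. The sketch below illustrates that loop in a self-contained way; `llm` and `run_tool` are illustrative stand-ins, not the package's real internals.

```python
# Minimal sketch of a recursion-capped tool-use loop, in the spirit of
# MCP_MAX_RECURSIONS. `llm` and `run_tool` are hypothetical stand-ins.
def run_conversation(prompt, llm, run_tool, max_recursions=15):
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_recursions):
        reply = llm(messages)                       # one model turn
        if reply.get("tool_call") is None:
            return reply["text"]                    # final answer: stop recursing
        result = run_tool(reply["tool_call"])       # execute the requested tool
        messages.append({"role": "tool", "content": result})
    raise RuntimeError("Hit max_recursions without a final answer")
```

The cap matters because a misbehaving model (or a tool that keeps returning unusable results) could otherwise loop indefinitely.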
Advanced Usage with Custom Tool Calls
To view an example, run:

```bash
python test.py
```

To check if the MCP server is running properly:

```bash
python check_server.py
```
Requirements
- Python 3.8 or higher
- An Anthropic API key (set as the environment variable `ANTHROPIC_API_KEY` or passed directly)
- A running MCP server using SSE
Configuration
The package uses the following environment variables:
- `ANTHROPIC_API_KEY`: Your Anthropic API key (required)
- `ANTHROPIC_MODEL`: Anthropic model to use (default: `claude-3-5-sonnet-latest`)
- `MCP_SERVER_URL`: URL of the MCP server (default: `http://localhost:8001/sse`)
- `MCP_MAX_RECURSIONS`: Maximum number of tool use recursions (default: 15)
- `MCP_MAX_RETRIES`: Maximum number of connection retries (default: 3)
- `MCP_RETRY_DELAY`: Delay between connection retries in seconds (default: 2.0)
- `MCP_TOOL_TIMEOUT`: Tool execution timeout in seconds (default: 30.0)
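The internals of `config.py` aren't reproduced here, but reading these variables with their documented defaults might look like the following sketch (the real module's structure may differ):

```python
import os

# Hypothetical sketch of environment-driven configuration; missing
# variables fall back to the defaults documented above.
ANTHROPIC_API_KEY = os.environ.get("ANTHROPIC_API_KEY")  # required at runtime
ANTHROPIC_MODEL = os.environ.get("ANTHROPIC_MODEL", "claude-3-5-sonnet-latest")
MCP_SERVER_URL = os.environ.get("MCP_SERVER_URL", "http://localhost:8001/sse")
MCP_MAX_RECURSIONS = int(os.environ.get("MCP_MAX_RECURSIONS", "15"))
MCP_MAX_RETRIES = int(os.environ.get("MCP_MAX_RETRIES", "3"))
MCP_RETRY_DELAY = float(os.environ.get("MCP_RETRY_DELAY", "2.0"))
MCP_TOOL_TIMEOUT = float(os.environ.get("MCP_TOOL_TIMEOUT", "30.0"))
```

Note that the numeric variables are parsed from strings, so setting e.g. `MCP_RETRY_DELAY=0.5` in the shell works as expected.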
Troubleshooting
If you encounter issues with tool calls or server connections:
1. Check Server Connection
Run the server connection check utility:

```bash
python check_server.py
```
This will verify that:
- The MCP server is reachable
- The server has the necessary tools available
- Basic tool execution works correctly
2. Test Individual Tools
If specific tools are failing, you can test them directly using the test tool utility:

```bash
python test_tool.py <tool_name> --args '<json_args>'
```
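`test_tool.py` isn't reproduced here, but parsing a command line of that shape is straightforward with `argparse`; the sketch below is a hypothetical illustration of the interface, and the real script's options may differ.

```python
import argparse
import json


def parse_cli(argv):
    """Hypothetical sketch of test_tool.py-style argument parsing:
    a positional tool name plus a JSON-encoded --args dictionary."""
    parser = argparse.ArgumentParser(description="Invoke a single MCP tool.")
    parser.add_argument("tool_name", help="Name of the tool to call")
    parser.add_argument(
        "--args",
        default="{}",
        help="JSON-encoded dictionary of tool arguments",
    )
    ns = parser.parse_args(argv)
    return ns.tool_name, json.loads(ns.args)
```

Passing the arguments as a single quoted JSON string avoids shell-quoting surprises with nested values.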
3. Common Issues with Database Tools
Database tools like list_schemas will fail if:
- The database connection is not properly configured on the server
- Database credentials are incorrect
- The database service is not running
- There are permission issues for the database user
4. Debugging Tool Call Failures
When a tool call fails, check the following:
- Server Logs: Look at the MCP server logs for detailed error messages
- Tool Arguments: Make sure you're providing all required arguments with the correct names
- Connection Issues: Verify that the connection to the MCP server is stable
- API Key: Confirm your Anthropic API key is valid and has sufficient quota
Quick Start
1. Clone the repository:

```bash
git clone https://github.com/grantr-code/orderli_recursive_mcp
cd orderli_recursive_mcp
```

2. Install the package (this is a Python package, so use pip, not npm):

```bash
pip install -e .
```

3. See the Usage and Configuration sections above for import examples and environment variables.