
mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps
A Python library for interacting with LLMs using mcp.run tools
Features
AI Provider Support
mcpx-py supports all models supported by PydanticAI
Dependencies
- uv
- npm
- ollama (optional)
mcp.run Setup
You will need to get an mcp.run session ID by running:
npx --yes -p @dylibso/mcpx gen-session --write
This will generate a new session and write the session ID to a configuration file that can be used
by mcpx-py.
If you need to store the session ID in an environment variable you can run gen-session
without the --write flag:
npx --yes -p @dylibso/mcpx gen-session
which should output something like:
Login successful!
Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
Then set the MCP_RUN_SESSION_ID environment variable:
$ export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
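Before invoking the library, you can sanity-check that the exported variable is actually visible to Python (whether mcpx-py reads this variable directly or falls back to the config file written by `--write` is up to the library; this sketch only verifies the shell export worked):

```python
import os

# The session ID that mcpx-py is told about via the environment
session_id = os.environ.get("MCP_RUN_SESSION_ID")
if session_id:
    print(f"session ID loaded ({len(session_id)} characters)")
else:
    print("MCP_RUN_SESSION_ID is not set - run gen-session first")
```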
Python Usage
Installation
Using uv:
uv add mcpx-py
Or pip:
pip install mcpx-py
Example code
from mcpx_py import Chat
llm = Chat("claude-3-5-sonnet-latest")
# Or OpenAI
# llm = Chat("gpt-4o")
# Or Ollama
# llm = Chat("ollama:qwen2.5")
# Or Gemini
# llm = Chat("gemini-2.0-flash")
response = llm.send_message_sync(
"summarize the contents of example.com"
)
print(response.data)
It's also possible to get structured output by setting result_type
from mcpx_py import Chat, BaseModel, Field
from typing import List
class Summary(BaseModel):
    """A summary of some longer text"""

    source: str = Field(description="The source of the original_text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")
llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
"summarize the contents of example.com"
)
print(response.data)
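Independent of any LLM call, `Summary` behaves like an ordinary Pydantic model (mcpx_py re-exports `BaseModel` and `Field` from pydantic), which is handy for inspecting the schema the model will be asked to fill. A minimal sketch using pydantic directly, with illustrative values:

```python
from typing import List

from pydantic import BaseModel, Field  # re-exported by mcpx_py


class Summary(BaseModel):
    """A summary of some longer text"""

    source: str = Field(description="The source of the original_text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")


# Construct an instance the way a structured LLM response would be parsed
s = Summary(
    source="https://example.com",
    original_text="Example Domain. This domain is for use in examples.",
    items=["Example domain", "Reserved for use in documentation"],
)
print(s.model_dump()["items"][0])  # -> Example domain
```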
More examples can be found in the examples/ directory
Command Line Usage
Installation
uv tool install mcpx-py
From git:
uv tool install git+https://github.com/dylibso/mcpx-py
Or from the root of the repo:
uv tool install .
uvx
mcpx-client can also be executed without being installed using uvx:
uvx --from mcpx-py mcpx-client
Or from git:
uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client
Running
Get usage/help
mcpx-client --help
Chat with an LLM
mcpx-client chat
List tools
mcpx-client list
Call a tool
mcpx-client tool eval-js '{"code": "2+2"}'
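The tool argument is a single JSON string; when driving `mcpx-client` from a script, building that string with `json.dumps` avoids shell-quoting mistakes (tool name and arguments taken from the example above):

```python
import json

# Arguments for the eval-js tool shown above
args = {"code": "2+2"}
payload = json.dumps(args)
print(payload)
```

The resulting string can then be passed as the final argument to `mcpx-client tool eval-js`, e.g. via `subprocess.run`.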
LLM Configuration
Provider Setup
Claude
- Sign up for an Anthropic API account at https://console.anthropic.com
- Get your API key from the console
- Set the environment variable:
ANTHROPIC_API_KEY=your_key_here
OpenAI
- Create an OpenAI account at https://platform.openai.com
- Generate an API key in your account settings
- Set the environment variable:
OPENAI_API_KEY=your_key_here
Gemini
- Create a Gemini account at https://aistudio.google.com
- Generate an API key in your account settings
- Set the environment variable:
GEMINI_API_KEY=your_key_here
Ollama
- Install Ollama from https://ollama.ai
- Pull your desired model: ollama pull llama3.2
- No API key needed - runs locally
Llamafile
- Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
- Make the file executable: chmod +x your-model.llamafile
- Run in JSON API mode: ./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
- Use with the OpenAI provider pointing to http://localhost:8080
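One way to point an OpenAI-compatible client at the local Llamafile server is through the standard `OPENAI_BASE_URL` and `OPENAI_API_KEY` environment variables honored by the OpenAI client libraries; whether mcpx-py's OpenAI provider picks these up is an assumption here, so treat this as a sketch:

```shell
# Llamafile exposes an OpenAI-compatible API under /v1
export OPENAI_BASE_URL="http://localhost:8080/v1"
# Local servers typically accept any non-empty key
export OPENAI_API_KEY="sk-no-key-required"
```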