
# Proxmox LangChain Agent
A lightweight FastAPI service that provides a chat endpoint backed by LangChain and Redis-based message history. This service uses an Ollama LLM for responses and persists conversations in Redis. It is designed to interact with a Proxmox MCP (Model Context Protocol) server, allowing conversational access to Proxmox cluster and VM information.
## Repository Structure

```
agent-proxmox/
├── Dockerfile
├── app.py
├── requirements.txt
└── README.md
```
- `Dockerfile`: Builds a slim Python image with the application and dependencies.
- `app.py`: FastAPI application defining a `/chat` endpoint that runs a LangChain agent with Proxmox-specific tools and Redis-backed history.
- `requirements.txt`: Python dependencies needed to run the agent.
## Prerequisites
- Docker Engine (20.x+)
- Redis instance accessible by the agent
- Proxmox MCP server accessible by the agent (see below)
- (Optional) Docker Compose if integrating into a larger stack
## Proxmox MCP Server Integration
This agent is designed to work with a Proxmox MCP (Model Context Protocol) server. The MCP server exposes HTTP endpoints for cluster and VM information, which the agent accesses using custom LangChain tools:
- `get_vm_list`: Calls the MCP endpoint `/mcp/context/vms` to retrieve a list of all VMs and their statuses.
- `get_cluster_info`: Calls the MCP endpoint `/mcp/context/cluster` to retrieve high-level Proxmox cluster status information.
You must have a running MCP server accessible at the address configured in `app.py` (default: `http://mcp:8008`).
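As a rough sketch of what these tool functions might look like under the hood (the `mcp_get` helper and the use of the standard library's `urllib` are illustrative assumptions; the actual `app.py` may use a different HTTP client):

```python
import json
import urllib.request

# Default MCP server address, as configured in app.py.
MCP_BASE_URL = "http://mcp:8008"

def mcp_get(path: str, base_url: str = MCP_BASE_URL) -> dict:
    """Fetch a JSON context document from the Proxmox MCP server."""
    with urllib.request.urlopen(f"{base_url}{path}") as resp:
        return json.load(resp)

def get_vm_list() -> dict:
    # Backing function for the get_vm_list tool: all VMs and their statuses.
    return mcp_get("/mcp/context/vms")

def get_cluster_info() -> dict:
    # Backing function for the get_cluster_info tool: cluster status info.
    return mcp_get("/mcp/context/cluster")
```

Each function would then be wrapped in a LangChain `Tool` so the agent can call it when answering Proxmox questions.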
## Environment Variables

| Variable | Default | Description |
|---|---|---|
| `REDIS_HOST` | `localhost` | Hostname or IP of your Redis server |
| `REDIS_PORT` | `6379` | Port number of your Redis server |
| `REDIS_PASSWORD` | none | Password for Redis (required if Redis is secured) |
| `LANGCHAIN_LOG_DIR` | `./logs` | Directory path (inside the container) for logs |

Make sure to set `REDIS_PASSWORD` if your Redis instance requires authentication.
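For reference, configuration loading of this kind typically looks like the following sketch (how `app.py` actually reads these variables may differ):

```python
import os

# Sketch of environment-based configuration with the defaults from the
# table above; the real app.py is authoritative.
REDIS_HOST = os.environ.get("REDIS_HOST", "localhost")
REDIS_PORT = int(os.environ.get("REDIS_PORT", "6379"))
REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD")  # None if Redis is unsecured
LANGCHAIN_LOG_DIR = os.environ.get("LANGCHAIN_LOG_DIR", "./logs")
```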
## Building the Docker Image

From within the project directory:

```bash
docker build -t agent-proxmox .
```
## Running the Container

A minimal `docker run` example:

```bash
docker run -d \
  --name agent-proxmox \
  -p 8501:8501 \
  -e REDIS_HOST=ai-redis \
  -e REDIS_PORT=6379 \
  -e REDIS_PASSWORD=$REDIS_PASSWORD \
  -e LANGCHAIN_LOG_DIR=/logs \
  -v $(pwd)/logs:/logs \
  agent-proxmox
```
This exposes the FastAPI app on port 8501 and mounts a local logs/ directory for persistent logging.
## Integrating with Docker Compose

If you have a larger Docker Compose setup, add this service:

```yaml
services:
  agent-proxmox:
    build: ./agent-proxmox
    container_name: agent-proxmox
    networks:
      - ai-stack
    depends_on:
      - ai-redis
      - mcp
    environment:
      - REDIS_HOST=ai-redis
      - REDIS_PORT=6379
      - REDIS_PASSWORD=${REDIS_PASSWORD}
      - LANGCHAIN_LOG_DIR=${LANGCHAIN_LOG_DIR}
    volumes:
      - ./agent-proxmox/logs:${LANGCHAIN_LOG_DIR}
    ports:
      - "8501:8501"
```
## API Usage

Send a POST request to `/chat` with a JSON body `{ "message": "Your question here" }`:

```bash
curl -X POST http://localhost:8501/chat \
  -H "Content-Type: application/json" \
  -d '{"message":"Show me all VMs in the cluster"}'
```

Response:

```json
{ "response": "<LLM reply>" }
```
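If you prefer calling the endpoint from Python rather than curl, a minimal stdlib-only client could look like this (the `chat` helper is illustrative and not part of the project):

```python
import json
import urllib.request

def chat(message: str, base_url: str = "http://localhost:8501") -> str:
    """Send one message to the agent's /chat endpoint and return the reply."""
    body = json.dumps({"message": message}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # The service responds with {"response": "<LLM reply>"}.
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```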
## Logs

Conversation logs are written to `langchain_YYYYMMDD.log` in the `LANGCHAIN_LOG_DIR` directory. Adjust verbosity by modifying the `logging.basicConfig` settings in `app.py`.
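A sketch of the kind of `logging.basicConfig` setup this implies (the format string and exact wiring in `app.py` are assumptions):

```python
import logging
import os
from datetime import datetime

log_dir = os.environ.get("LANGCHAIN_LOG_DIR", "./logs")
os.makedirs(log_dir, exist_ok=True)

# One log file per day: langchain_YYYYMMDD.log
log_file = os.path.join(log_dir, f"langchain_{datetime.now():%Y%m%d}.log")

# Lower the level to logging.DEBUG for more verbose agent traces.
logging.basicConfig(
    filename=log_file,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
```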
## CI/CD Pipeline (GitLab)

This project uses a GitLab CI/CD pipeline to automate validation, linting, building, and deployment of the Docker container to the GitLab Container Registry. The pipeline is defined in `.gitlab-ci.yml` and includes the following stages:
- `validate`: Installs dependencies and checks Python syntax.
- `lint`: Runs flake8 to enforce code style and quality.
- `deploy`: Builds and pushes the Docker image to the registry (only on the `main` branch and tags).
The pipeline uses GitLab's built-in CI/CD variables for authentication and registry information. No additional variables are required for standard operation.
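A hypothetical `.gitlab-ci.yml` matching the stages described above (the job bodies are a sketch; the file in the repository is authoritative):

```yaml
stages: [validate, lint, deploy]

validate:
  stage: validate
  image: python:3.11-slim
  script:
    - pip install -r requirements.txt
    - python -m py_compile app.py

lint:
  stage: lint
  image: python:3.11-slim
  script:
    - pip install flake8
    - flake8 app.py

deploy:
  stage: deploy
  image: docker:latest
  services: [docker:dind]
  script:
    # CI_REGISTRY_* are GitLab's built-in CI/CD variables.
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG"
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_TAG
```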
## FastAPI Application Overview

- The main application is in `app.py` and uses FastAPI to provide a `/chat` endpoint.
- The endpoint expects a POST request with a JSON body: `{ "message": "Your question here" }`.
- The backend uses LangChain's agent framework with two custom tools for Proxmox MCP integration.
- Conversation history is stored in Redis using `langchain_community.chat_message_histories.RedisChatMessageHistory`.
- The LLM is provided by an Ollama server (default: `http://ollama:11434`).
- Logging is configured to write conversation logs to the directory specified by `LANGCHAIN_LOG_DIR` (default: `./logs`).
## Developer Notes

- **Code Style**: The project enforces PEP 8 compliance using flake8. Ensure your code passes `flake8 app.py` before committing.
- **Environment Variables**: See the table above. You must set `REDIS_PASSWORD` if your Redis instance is secured.
- **Testing the API**: Use the provided `curl` example or a tool like Postman to interact with the `/chat` endpoint.
- **Extending the Agent**: To add new tools, define a new function and add it to the `tools` list in `app.py` using the `Tool` class from LangChain.
- **Pipeline Troubleshooting**: If the Docker image fails to push, ensure you are pushing from the `main` branch or a tag, as the pipeline only deploys in those cases.
- **Logs**: All conversations are logged with timestamps. Check the `logs/` directory for daily log files.
## Contributing

- Fork the repository and create a feature branch.
- Ensure your code passes linting and validation locally:

  ```bash
  pip install -r requirements.txt
  flake8 app.py
  python -m py_compile app.py
  ```

- Push your branch and create a merge request.
For questions or issues, please open an issue in the GitLab repository.
Happy chaining!
## Quick Start

Clone the repository:

```bash
git clone https://github.com/johnstetter/agent-proxmox
cd agent-proxmox
```

This is a Python project, so install dependencies with pip:

```bash
pip install -r requirements.txt
```

Then follow the build and run instructions above for Docker-based deployment.