
Grafana Agent PoC with Semantic Kernel
This project demonstrates a proof of concept for building an AI agent using the Semantic Kernel Python SDK that interfaces with Grafana through a local MCP server.
Prerequisites
- Python 3.8+
- Docker and Docker Compose
- Ollama
Setup
- Create a virtual environment and activate it:
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
- Configure environment variables:
- Copy the .env.example file to create a new .env file:
cp .env.example .env
- Update the .env file with your credentials (see the example .env sketch after this setup list):
- AZURE_OPENAI_DEPLOYMENT_NAME: Your Azure OpenAI deployment name
- AZURE_OPENAI_API_VERSION: API version (e.g., 2024-12-01-preview)
- AZURE_OPENAI_API_KEY: Your Azure OpenAI API key
- AZURE_OPENAI_BASE_URL: Your Azure OpenAI base URL
- GRAFANA_MCP_URL: Local MCP server URL (default: http://localhost:8000/sse)
- GRAFANA_URL: Grafana instance URL (default: http://localhost:3000)
- GRAFANA_ADMIN_USERNAME: Grafana admin username
- GRAFANA_ADMIN_PASSWORD: Grafana admin password
- GRAFANA_API_KEY: Your Grafana API key (create one in Grafana UI)
- Start Grafana using Docker Compose:
cd docker
docker-compose up -d
- Configure Grafana:
- Access Grafana at http://localhost:3000
- Log in with the credentials set in your .env file
- Go to Configuration -> API Keys
- Create a new API key with Admin role
- Copy the generated key to your .env file's GRAFANA_API_KEY
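For reference, here is a minimal .env sketch using the variables listed above; every value is a placeholder and must be replaced with your own credentials:
AZURE_OPENAI_DEPLOYMENT_NAME=<your-deployment-name>
AZURE_OPENAI_API_VERSION=2024-12-01-preview
AZURE_OPENAI_API_KEY=<your-azure-openai-api-key>
AZURE_OPENAI_BASE_URL=https://<your-resource>.openai.azure.com/
GRAFANA_MCP_URL=http://localhost:8000/sse
GRAFANA_URL=http://localhost:3000
GRAFANA_ADMIN_USERNAME=admin
GRAFANA_ADMIN_PASSWORD=<your-admin-password>
GRAFANA_API_KEY=<key-created-in-the-configure-grafana-step>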
Running the Applications
The project provides three ways to interact with Grafana:
- Basic Console Interface (src/console.py):
python src/console.py
A basic chat-based console interface powered by an LLM and Semantic Kernel.
- Interactive Console Agent (src/console_agent.py):
python src/console_agent.py
An agent-based console interface powered by an LLM and Semantic Kernel, allowing for more complex interactions.
- REST API Server (src/main.py):
PYTHONPATH=. uvicorn src.main:app --reload --host 0.0.0.0 --port 8080 --log-level debug
Access the API at http://localhost:8080
- REST API documentation: http://localhost:8080/docs
- Health check: http://localhost:8080/health
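All three entry points depend on the local Grafana MCP server being reachable at GRAFANA_MCP_URL. The applications open that connection themselves, but if you want to sanity-check the server first, a rough sketch using the mcp Python package could look like this (the package usage shown here is an assumption for illustration, not part of this repository):
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

MCP_URL = "http://localhost:8000/sse"  # same value as GRAFANA_MCP_URL

async def list_grafana_tools() -> None:
    # Open the SSE transport to the MCP server and start a client session.
    async with sse_client(MCP_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

if __name__ == "__main__":
    asyncio.run(list_grafana_tools())
If this prints a list of Grafana-related tools, the MCP server is up and the agent should be able to use it.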
API Endpoints
Health Check
GET /health
Query the Agent
POST /query
Content-Type: application/json
{
"query": "Your natural language query here"
}
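As a quick test once the API server is running, you can hit both endpoints with the requests library (a sketch; the exact response shape is whatever the agent returns and is not documented here):
import requests

BASE_URL = "http://localhost:8080"

# Health check
health = requests.get(f"{BASE_URL}/health", timeout=10)
print(health.status_code, health.text)

# Send a natural language query to the agent
payload = {"query": "Create a new dashboard called 'System Metrics'"}
response = requests.post(f"{BASE_URL}/query", json=payload, timeout=120)
response.raise_for_status()
print(response.json())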
Grafana Access
Grafana UI is available at http://localhost:3000
- Use the credentials specified in your .env file
Example Usage
Here are some example queries you can send to the agent:
- Create a new dashboard:
{
"query": "Create a new dashboard called 'System Metrics'"
}
- Get dashboard information:
{
"query": "Show me the details of dashboard with UID xyz123"
}
- Update a dashboard:
{
"query": "Update the System Metrics dashboard to include a CPU usage panel"
}
Project Structure
graphana-agent-poc/
├── docker/
│ └── docker-compose.yml # Grafana container configuration
├── src/
│ ├── agent/
│ │ ├── plugins/
│ │ │ └── grafana_plugin.py # Semantic Kernel plugin for Grafana
│ │ └── skills/
│ │ └── grafana_skills.py # Implementation of Grafana operations
│ └── main.py # FastAPI application and agent setup
├── requirements.txt # Python dependencies
└── README.md # This file
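To give a feel for how src/agent/plugins/grafana_plugin.py can expose Grafana operations to the agent, here is a rough sketch of a Semantic Kernel plugin; the class, method names, and Grafana HTTP API calls below are illustrative assumptions, not the repository's actual implementation:
import os

import requests
from semantic_kernel.functions import kernel_function


class GrafanaPlugin:
    """Illustrative plugin wrapping a couple of Grafana HTTP API calls."""

    def __init__(self) -> None:
        self.base_url = os.getenv("GRAFANA_URL", "http://localhost:3000")
        self.headers = {"Authorization": f"Bearer {os.getenv('GRAFANA_API_KEY', '')}"}

    @kernel_function(name="get_dashboard", description="Fetch a Grafana dashboard by its UID.")
    def get_dashboard(self, uid: str) -> str:
        resp = requests.get(f"{self.base_url}/api/dashboards/uid/{uid}",
                            headers=self.headers, timeout=30)
        resp.raise_for_status()
        return resp.text

    @kernel_function(name="search_dashboards", description="Search Grafana dashboards by title.")
    def search_dashboards(self, query: str) -> str:
        resp = requests.get(f"{self.base_url}/api/search", params={"query": query},
                            headers=self.headers, timeout=30)
        resp.raise_for_status()
        return resp.text
A plugin like this would typically be registered on the kernel (for example via kernel.add_plugin(GrafanaPlugin(), plugin_name="grafana")) so the agent can invoke these functions from natural language requests.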
Quick Start
Clone the repository:
git clone https://github.com/benbakhar/grafana-ai-agent
cd grafana-ai-agent
Install dependencies:
pip install -r requirements.txt
Follow the documentation:
See the Setup and Running the Applications sections above for detailed configuration and usage instructions.