
Infra Pilot: Terraform MCP Server
Infra Pilot is an MCP-compliant, agentic DevOps copilot that understands and answers questions about your Terraform infrastructure code using a local LLM (LLaMA via Ollama), LangGraph agent flow, and vector-based RAG.
Think of it as a ChatGPT-like assistant that knows your actual Terraform setup — and can generate documentation, answer questions, and be extended to detect drift, security issues, or cost spikes.
Features
- Local LLM inference (LLaMA 3 via Ollama — no OpenAI key required)
- LangGraph-powered multi-step agent with structured context
- Embeds .tf files and answers natural language questions
- Auto-generates Markdown documentation from code
- Streamlit UI for easy querying
- FastAPI backend with /ask and /docs/generate endpoints
Project Structure
infra-pilot/
├── app/
│   ├── main.py            # FastAPI app
│   ├── routes/            # /ask and /docs endpoints
│   ├── services/          # Embedding + inference logic
│   ├── agents/            # LangGraph reasoning flow
│   ├── templates/         # Structured prompt template
│   └── ui/app.py          # Streamlit UI
├── infra/                 # Your Terraform code (.tf files)
├── chroma_store/          # Vector DB store (auto-generated)
├── docs_output/           # Markdown docs (generated)
├── embed_terraform.py     # Script to re-embed .tf files
├── requirements.txt
├── setup.sh               # Quick start setup script
└── README.md
How It Works (MCP Pipeline)
- Code context is embedded with HuggingFace + Chroma
- A LangGraph agent uses a structured prompt template:
## Code: {code_chunks}
## Vars: {tf_vars}
## Question: {user_query}
- LLaMA model (via Ollama) generates answers
- Markdown docs are generated with the same reasoning chain
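For orientation, here is a minimal end-to-end sketch of that pipeline. It assumes LangChain's HuggingFace embedding and Chroma wrappers plus the ollama Python client; the function name ask_infra and the embedding model are illustrative, not the project's actual API (see app/services and app/agents for the real implementation):
# Minimal sketch, not the project's actual code; assumes embed_terraform.py
# has already populated chroma_store/ from the .tf files in infra/
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma
import ollama

PROMPT = """## Code: {code_chunks}
## Vars: {tf_vars}
## Question: {user_query}"""

def ask_infra(user_query: str, tf_vars: str = "") -> str:
    # Load the persisted vector store built from the embedded .tf chunks
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    store = Chroma(persist_directory="chroma_store", embedding_function=embeddings)
    # Retrieve the code chunks most relevant to the question
    chunks = store.similarity_search(user_query, k=4)
    # Fill the structured prompt template
    prompt = PROMPT.format(
        code_chunks="\n\n".join(doc.page_content for doc in chunks),
        tf_vars=tf_vars,
        user_query=user_query,
    )
    # Generate the answer locally with LLaMA 3 via Ollama
    response = ollama.chat(model="llama3", messages=[{"role": "user", "content": prompt}])
    return response["message"]["content"]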
Setup Instructions
1. Run the Setup Script
chmod +x setup.sh
./setup.sh
This will:
- Create a virtual environment
- Install all dependencies
- Prompt you to run embedding and start the app
2. Or Follow Manual Setup
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
ollama run llama3
python embed_terraform.py
uvicorn app.main:app --reload
streamlit run app/ui/app.py
Example Questions (Try These in Streamlit)
- "What does this module create?"
- "What IAM permissions are given to Lambda?"
- "Is the RDS database publicly accessible?"
- "List the input variables and their defaults."
Generate Docs
Click " Generate Markdown Documentation" in the Streamlit UI, or call it directly:
curl -X POST http://localhost:8000/docs/generate
Output saved to:
docs_output/infra-doc-YYYYMMDD-HHMMSS.md
Roadmap Ideas
- Slack bot interface: /ask-infra
- Drift detection with AWS SDK / TF state
- Cost estimation with Infracost
- Test mode for mocking plan/apply
- Auto-export docs to GitHub/Confluence
Contributing
Want to build a plugin, add an agent tool, or enhance the UI? PRs and ideas are welcome; let's build InfraPilot into the ultimate infra copilot.
Questions / Feedback
Open an issue or ping @siva if you need help, extensions, or a demo setup!
Quick Start
Clone the repository and run the setup script described above:
git clone https://github.com/sivabhimireddy/infra-pilot
cd infra-pilot
./setup.sh