
MCP-RAG: Model Context Protocol with RAG 🚀
A powerful and efficient RAG (Retrieval-Augmented Generation) implementation using GroundX and OpenAI, built on the Model Context Protocol (MCP).
🌟 Features
- Advanced RAG Implementation: Utilizes GroundX for high-accuracy document retrieval
- Model Context Protocol: Seamless integration with MCP clients for enhanced context handling
- Type-Safe: Built with Pydantic for robust type checking and validation
- Flexible Configuration: Easy-to-customize settings through environment variables
- Document Ingestion: Support for PDF document ingestion and processing
- Intelligent Search: Semantic search capabilities with scoring
🛠️ Prerequisites
- Python 3.12 or higher
- OpenAI API key
- GroundX API key
- MCP CLI tools
📦 Installation
- Clone the repository:
git clone <repository-url>
cd mcp-rag
- Create and activate a virtual environment:
uv sync
source .venv/bin/activate # On Windows, use `.venv\Scripts\activate`
⚙️ Configuration
- Copy the example environment file:
cp .env.example .env
- Configure your environment variables in .env:
GROUNDX_API_KEY="your-groundx-api-key"
OPENAI_API_KEY="your-openai-api-key"
BUCKET_ID="your-bucket-id"
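The README advertises type-safe configuration via Pydantic. One plausible way server.py might load these three variables (the Settings class and load_settings helper below are illustrative assumptions, not the project's actual code):

```python
import os

from pydantic import BaseModel


class Settings(BaseModel):
    """Hypothetical settings model; field names mirror the .env keys above."""

    groundx_api_key: str
    openai_api_key: str
    bucket_id: str


def load_settings() -> Settings:
    # Read the three variables from the environment (e.g. populated from .env);
    # a missing key raises KeyError early instead of failing later mid-request.
    return Settings(
        groundx_api_key=os.environ["GROUNDX_API_KEY"],
        openai_api_key=os.environ["OPENAI_API_KEY"],
        bucket_id=os.environ["BUCKET_ID"],
    )
```

Validating configuration up front like this surfaces a missing or malformed key at startup rather than on the first search request.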
🚀 Usage
Starting the Server
Run the server with the MCP Inspector using:
mcp dev server.py
Document Ingestion
To ingest new documents:
from server import ingest_documents
result = ingest_documents("path/to/your/document.pdf")
print(result)
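To ingest a whole directory of PDFs, you could collect the file paths first and feed them to ingest_documents one at a time (find_pdfs is a hypothetical helper for this README, not part of server.py):

```python
from pathlib import Path


def find_pdfs(folder: str) -> list[str]:
    """Return sorted paths of all PDFs in a folder, ready for ingestion."""
    return sorted(str(p) for p in Path(folder).glob("*.pdf"))
```

For example: `for pdf in find_pdfs("docs"): print(ingest_documents(pdf))`.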
Performing Searches
Basic search query:
from server import process_search_query
response = process_search_query("your search query here")
print(f"Query: {response.query}")
print(f"Score: {response.score}")
print(f"Result: {response.result}")
With custom configuration:
from server import process_search_query, SearchConfig
config = SearchConfig(
    completion_model="gpt-4",
    bucket_id="custom-bucket-id"
)
response = process_search_query("your query", config)
📚 Dependencies
- groundx (≥2.3.0): Core RAG functionality
- openai (≥1.75.0): OpenAI API integration
- mcp[cli] (≥1.6.0): Model Context Protocol tools
- ipykernel (≥6.29.5): Jupyter notebook support
🔒 Security
- Never commit your .env file containing API keys
- Use environment variables for all sensitive information
- Regularly rotate your API keys
- Monitor API usage for any unauthorized access
🤝 Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Quick Start
Clone the repository:
git clone https://github.com/sourangshupal/mcp-rag
Install dependencies:
cd mcp-rag
uv sync
Then follow the configuration and usage instructions above.