
MCP Gemini Server
A server implementation of the Model Context Protocol (MCP) to enable AI assistants like Claude to interact with Google's Gemini API.
Project Overview
This project implements a server that follows the Model Context Protocol, allowing AI assistants to communicate with Google's Gemini models. With this MCP server, AI assistants can request text generation and text analysis, and maintain chat conversations through the Gemini API.
Features
- Client-Server Communication: Implements MCP protocol for secure message exchange between client and server.
- Message Processing: Handles and processes client requests, sending appropriate responses.
- Error Handling & Logging: Logs server activities and ensures smooth error recovery.
- Environment Variables Support: Uses a .env file for storing sensitive information securely.
- API Testing & Debugging: Supports manual and automated testing using Postman and test scripts.
Installation
Prerequisites
- Python 3.7 or higher
- Google AI API key
Setup
- Clone this repository:
  git clone https://github.com/yourusername/mcp-gemini-server.git
  cd mcp-gemini-server
- Create a virtual environment:
  python -m venv venv
- Activate the virtual environment:
  - Windows: venv\Scripts\activate
  - macOS/Linux: source venv/bin/activate
- Install dependencies:
  pip install -r requirements.txt
- Create a .env file in the root directory with your Gemini API key:
  GEMINI_API_KEY=your_api_key_here
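To confirm the key is picked up before you start the server, a quick standalone check along these lines can help. This is a minimal sketch, not the repository's own startup code, and it assumes the python-dotenv and google-generativeai packages are installed:

# check_key.py - hypothetical helper, not part of the repository
import os

from dotenv import load_dotenv
import google.generativeai as genai

load_dotenv()  # reads GEMINI_API_KEY from the .env file in the project root
api_key = os.getenv("GEMINI_API_KEY")
if not api_key:
    raise RuntimeError("GEMINI_API_KEY is missing from .env")

genai.configure(api_key=api_key)
print("Gemini API key loaded")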
Usage
- Start the server:
  python server.py
- The server will run on http://localhost:5000/ by default.
- Send MCP requests to the /mcp endpoint using the POST method.
Example Request
import requests

url = 'http://localhost:5000/mcp'
payload = {
    'action': 'generate_text',
    'parameters': {
        'prompt': 'Write a short poem about AI',
        'temperature': 0.7
    }
}

response = requests.post(url, json=payload)
print(response.json())
API Reference
Endpoints
- GET /health: Check if the server is running
- GET /list-models: List available Gemini models
- POST /mcp: Main endpoint for MCP requests
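The two GET endpoints can be exercised directly with the requests library. The sketch below assumes both return JSON bodies; the exact fields are not specified here, so the output is printed as-is:

import requests

BASE_URL = 'http://localhost:5000'

# Confirm the server is up
health = requests.get(f'{BASE_URL}/health')
print(health.status_code, health.json())

# See which Gemini models the server exposes
models = requests.get(f'{BASE_URL}/list-models')
print(models.json())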
MCP Actions
1. generate_text
Generate text content with Gemini.
Parameters:
- prompt (required): The text prompt for generation
- temperature (optional): Controls randomness (0.0 to 1.0)
- max_tokens (optional): Maximum tokens to generate
Example:
{
  "action": "generate_text",
  "parameters": {
    "prompt": "Write a short story about a robot",
    "temperature": 0.8,
    "max_tokens": 500
  }
}
2. analyze_text
Analyze text content.
Parameters:
- text (required): The text to analyze
- analysis_type (optional): Type of analysis ('sentiment', 'summary', 'keywords', or 'general')
Example:
{
  "action": "analyze_text",
  "parameters": {
    "text": "The weather today is wonderful! I love how the sun is shining.",
    "analysis_type": "sentiment"
  }
}
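Sent from Python, this is the same pattern as the earlier example request, only with a different action. The shape of the returned analysis is not pinned down above, so the sketch simply prints whatever comes back:

import requests

payload = {
    'action': 'analyze_text',
    'parameters': {
        'text': 'The weather today is wonderful! I love how the sun is shining.',
        'analysis_type': 'sentiment'
    }
}

response = requests.post('http://localhost:5000/mcp', json=payload)
print(response.json())  # inspect the 'result' field for the sentiment analysis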
3. chat
Have a conversation with Gemini.
Parameters:
- messages (required): Array of message objects with 'role' and 'content'
- temperature (optional): Controls randomness (0.0 to 1.0)
Example:
{
  "action": "chat",
  "parameters": {
    "messages": [
      {"role": "user", "content": "Hello, how are you?"},
      {"role": "assistant", "content": "I'm doing well! How can I help?"},
      {"role": "user", "content": "Tell me about quantum computing"}
    ],
    "temperature": 0.7
  }
}
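To keep a conversation going across turns, append each assistant reply to the messages array before sending the next request. The sketch below assumes the reply comes back under the result field, as described in the protocol section; adjust the field name if your server returns a different shape:

import requests

URL = 'http://localhost:5000/mcp'

def send_chat(messages):
    # POST the full conversation history to the /mcp endpoint
    payload = {'action': 'chat', 'parameters': {'messages': messages, 'temperature': 0.7}}
    return requests.post(URL, json=payload).json()

messages = [{'role': 'user', 'content': 'Hello, how are you?'}]
first_reply = send_chat(messages)
print(first_reply)

# Feed the reply back (assumed to live under 'result') plus the next user turn,
# then send the whole history again.
messages.append({'role': 'assistant', 'content': str(first_reply.get('result'))})
messages.append({'role': 'user', 'content': 'Tell me about quantum computing'})
print(send_chat(messages))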
Error Handling
The server returns appropriate HTTP status codes and error messages:
- 200: Successful request
- 400: Bad request (missing or invalid parameters)
- 500: Server error (API issues, etc.)
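On the client side this means checking the status code before trusting the body. A small sketch, assuming error responses are JSON with an error field as described in the protocol section below:

import requests

response = requests.post(
    'http://localhost:5000/mcp',
    json={'action': 'generate_text', 'parameters': {}}  # 'prompt' deliberately missing
)

if response.status_code == 200:
    print('Result:', response.json().get('result'))
elif response.status_code == 400:
    print('Bad request:', response.json().get('error'))
else:
    print('Server error:', response.status_code, response.text)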
Testing
Use the included test script to test various functionalities:
# Test all functionalities
python test_client.py
# Test specific functionality
python test_client.py text # Test text generation
python test_client.py analyze # Test text analysis
python test_client.py chat # Test chat functionality
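If you prefer automated assertions to the manual script, a minimal pytest-style check against a running server could look like the sketch below. These tests are not part of the repository and assume the server is already listening on port 5000:

import requests

BASE_URL = 'http://localhost:5000'

def test_health():
    assert requests.get(f'{BASE_URL}/health').status_code == 200

def test_generate_text_returns_result():
    payload = {'action': 'generate_text', 'parameters': {'prompt': 'Say hello'}}
    response = requests.post(f'{BASE_URL}/mcp', json=payload)
    assert response.status_code == 200
    assert 'result' in response.json()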
MCP Protocol Specification
The Model Context Protocol implemented here follows these specifications:
- Request Format:
  - action: String specifying the operation
  - parameters: Object containing action-specific parameters
- Response Format:
  - result: Object containing the operation result
  - error: String explaining any error (when applicable)
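As an illustration only (these type definitions are not part of the project), the envelope can be summarised in Python as follows; TypedDict requires Python 3.8 or newer:

from typing import Any, Dict, TypedDict

class MCPRequest(TypedDict):
    action: str                 # 'generate_text', 'analyze_text', or 'chat'
    parameters: Dict[str, Any]  # action-specific parameters

class MCPResponse(TypedDict, total=False):
    result: Any  # present on success
    error: str   # present when the request failed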
License
MIT License