
GitHub Models Helper MCP Server
This MCP server helps you interact with and compare different language models available via GitHub Models and AzureML, including OpenAI, Microsoft, Meta, Mistral, and more.
Features
- List available language models with metadata
- Compare responses from different models for the same prompt
- Filter and sort models by various criteria
- Comprehensive error handling and fallbacks
- Visualize model comparisons (see below)
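As a rough illustration of the filtering and sorting features, here is a client-side sketch over the model list the server returns. The metadata fields (`id`, `displayName`, `contextWindow`) mirror the table below, but the exact JSON shape returned by the server is an assumption, not its documented schema.

```typescript
// Hypothetical shape of one entry in the server's model list (assumption).
interface ModelInfo {
  id: string;
  displayName: string;
  contextWindow: number;
}

// Keep models whose ID matches a prefix, largest context window first.
function filterAndSort(models: ModelInfo[], idPrefix: string): ModelInfo[] {
  return models
    .filter((m) => m.id.toLowerCase().startsWith(idPrefix.toLowerCase()))
    .sort((a, b) => b.contextWindow - a.contextWindow);
}

const models: ModelInfo[] = [
  { id: "Phi-3-medium-4k-instruct", displayName: "Phi-3-medium instruct (4k)", contextWindow: 4096 },
  { id: "gpt-4o-mini", displayName: "GPT-4o mini", contextWindow: 128000 },
  { id: "Phi-3-medium-128k-instruct", displayName: "Phi-3-medium instruct (128k)", contextWindow: 131072 },
];

console.log(filterAndSort(models, "phi-3").map((m) => m.id));
```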
Getting Started
1. Install dependencies:

   ```
   npm install
   ```

2. Set up environment variables: copy `.env.template` to `.env` and add your GitHub token:

   ```
   GITHUB_TOKEN=your_github_personal_access_token
   ```

3. Build the project:

   ```
   npm run build
   ```

4. Run the MCP server in development mode:

   ```
   npx @modelcontextprotocol/inspector dist/index.js
   ```

5. Add the MCP server to Claude Desktop by adding this entry to `claude_desktop_config.json`:

   ```json
   {
     "mcpServers": {
       "GitHub Models Helper": {
         "command": "node",
         "args": ["/absolute/path/to/gh-models-helper/dist/index.js"],
         "env": {
           "GITHUB_TOKEN": "your_github_personal_access_token"
         }
       }
     }
   }
   ```
Available Phi-3 Models
| Model ID | Display Name | Context Window | Summary |
|---|---|---|---|
| Phi-3-medium-128k-instruct | Phi-3-medium instruct (128k) | 131,072 | Same Phi-3-medium model, but with a larger context size |
| Phi-3-medium-4k-instruct | Phi-3-medium instruct (4k) | 4,096 | 14B parameters, better quality than Phi-3-mini |
Note: There is currently no model named "Phi-4" or "Phi-3-mini-4k-instruct" in the available list. Use the above IDs for comparisons.
Visualizing Model Comparisons
You can compare how different models respond to the same prompt and visualize the results. For example, to compare three models:
- `Phi-3-medium-128k-instruct`
- `gpt-4o-mini`
- `Mistral-small`
Example prompt:
Explain the difference between AI and machine learning.
Sample output visualization:
| Model | Response |
|---|---|
| Phi-3-medium-128k-instruct | ... |
| gpt-4o-mini | ... |
| Mistral-small | ... |
You can use your own prompt and models. The server will return a JSON object with the responses, which you can render as a table or chart in your application.
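For instance, the JSON responses could be rendered as a Markdown table like the one above. This is a minimal sketch; the `{ model, response }` entry shape is an assumption about the returned JSON, not the server's documented schema.

```typescript
// Hypothetical shape of one comparison entry (assumption, not the documented schema).
interface ModelResponse {
  model: string;
  response: string;
}

// Render comparison results as a Markdown table, flattening newlines
// so each response stays on a single table row.
function toMarkdownTable(results: ModelResponse[]): string {
  const header = "| Model | Response |\n|---|---|";
  const rows = results.map(
    (r) => `| ${r.model} | ${r.response.replace(/\n/g, " ")} |`
  );
  return [header, ...rows].join("\n");
}

console.log(
  toMarkdownTable([
    { model: "Phi-3-medium-128k-instruct", response: "AI is the broader field..." },
    { model: "gpt-4o-mini", response: "Machine learning is a subset of AI..." },
  ])
);
```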
Example Prompts
- "list all available phi-3 models"
- "compare Phi-3-medium-4k-instruct and Mistral-small on this prompt: how many ns in bananasss??"
- "Do a comparison between the Phi-3-medium-128k-instruct, gpt-4o-mini, and Mistral-small models"
For more details, see the code and documentation in project.md.
Quick Start
1. Clone the repository:

   ```
   git clone https://github.com com/michaelwybraniec/mcp-server-gh-models-helper
   ```

2. Install dependencies:

   ```
   cd mcp-server-gh-models-helper
   npm install
   ```

3. Follow the documentation: check the repository's README.md for specific installation and usage instructions.