michaelwybraniec / mcp-server-gh-models-helper (public MCP server)

Helps users interact with language models on GitHub and AzureML and compare their responses.

Repository Info

  • Stars: 0
  • Forks: 0
  • Watchers: 0
  • Issues: 1
  • Language: TypeScript
  • License: -

About This Server

Helps users interact with language models on GitHub and AzureML and compare their responses.

Model Context Protocol (MCP): this server can be integrated with AI applications to provide additional context and capabilities.

Documentation

GitHub Models Helper MCP Server

This MCP server helps you interact with and compare different language models available via GitHub Models and AzureML, including OpenAI, Microsoft, Meta, Mistral, and more.

Features

  • List available language models with metadata
  • Compare responses from different models for the same prompt
  • Filter and sort models by various criteria
  • Comprehensive error handling and fallbacks
  • Visualize model comparisons (see below)
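To illustrate the filter-and-sort feature, here is a minimal TypeScript sketch. The `ModelInfo` shape, the `filterAndSort` helper, and the non-Phi-3 context-window figure are assumptions for illustration, not the server's actual API; the Phi-3 figures come from the table below.

```typescript
// Hypothetical metadata shape for a listed model (not the server's actual schema).
interface ModelInfo {
  id: string;
  publisher: string;
  contextWindow: number;
}

// Filter models by publisher (case-insensitive) and sort by context window,
// largest first.
function filterAndSort(models: ModelInfo[], publisher: string): ModelInfo[] {
  return models
    .filter((m) => m.publisher.toLowerCase() === publisher.toLowerCase())
    .sort((a, b) => b.contextWindow - a.contextWindow);
}

const catalog: ModelInfo[] = [
  { id: "Phi-3-medium-4k-instruct", publisher: "Microsoft", contextWindow: 4096 },
  { id: "Phi-3-medium-128k-instruct", publisher: "Microsoft", contextWindow: 131072 },
  { id: "Mistral-small", publisher: "Mistral AI", contextWindow: 32768 }, // illustrative figure
];

console.log(filterAndSort(catalog, "microsoft").map((m) => m.id));
// → [ 'Phi-3-medium-128k-instruct', 'Phi-3-medium-4k-instruct' ]
```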

Getting Started

  1. Install dependencies:
    npm install
    
  2. Set up environment variables: Copy .env.template to .env and add your GitHub token:
    GITHUB_TOKEN=your_github_personal_access_token
    
  3. Build the project:
    npm run build
    
  4. Run the MCP server in development mode:
    npx @modelcontextprotocol/inspector dist/index.js
    
  5. Add the MCP server to Claude Desktop: In claude_desktop_config.json:
    {
      "mcpServers": {
        "GitHub Models Helper": {
          "command": "node",
          "args": [
            "/absolute/path/to/gh-models-helper/dist/index.js"
          ],
          "env": {
            "GITHUB_TOKEN": "your_github_personal_access_token"
          }
        }
      }
    }
    

Available Phi-3 Models

| Model ID | Display Name | Context Window | Summary |
| --- | --- | --- | --- |
| Phi-3-medium-128k-instruct | Phi-3-medium instruct (128k) | 131,072 | Same Phi-3-medium model, but with a larger context size |
| Phi-3-medium-4k-instruct | Phi-3-medium instruct (4k) | 4,096 | 14B parameters, better quality than Phi-3-mini |

Note: There is currently no model named "Phi-4" or "Phi-3-mini-4k-instruct" in the available list. Use the above IDs for comparisons.
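In the spirit of the error-handling-and-fallbacks feature, a request for an unavailable ID like "Phi-4" could be resolved gracefully along these lines. The `resolveModelId` function and the chosen fallback ID are hypothetical; only the two model IDs themselves come from the table above.

```typescript
// IDs currently listed (from the Phi-3 table above).
const AVAILABLE_PHI3_IDS = [
  "Phi-3-medium-128k-instruct",
  "Phi-3-medium-4k-instruct",
];

// Resolve a requested model ID, falling back to a known Phi-3 ID
// when the requested one is not in the available list.
function resolveModelId(
  requested: string,
  fallback = "Phi-3-medium-4k-instruct"
): string {
  return AVAILABLE_PHI3_IDS.includes(requested) ? requested : fallback;
}

console.log(resolveModelId("Phi-4"));                      // falls back
console.log(resolveModelId("Phi-3-medium-128k-instruct")); // kept as-is
```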

Visualizing Model Comparisons

You can compare how different models respond to the same prompt and visualize the results. For example, to compare three models:

  • Phi-3-medium-128k-instruct
  • gpt-4o-mini
  • Mistral-small

Example prompt:

Explain the difference between AI and machine learning.

Sample output visualization:

| Model | Response |
| --- | --- |
| Phi-3-medium-128k-instruct | ... |
| gpt-4o-mini | ... |
| Mistral-small | ... |

You can use your own prompt and models. The server will return a JSON object with the responses, which you can render as a table or chart in your application.
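As a sketch of that rendering step, assuming the returned JSON maps model IDs to response strings (the server's actual schema may differ), a small helper can turn the object into the kind of table shown above:

```typescript
// Assumed shape of the server's comparison result: model ID → response text.
type ComparisonResult = Record<string, string>;

// Render the comparison as a plain-text two-column table.
function renderTable(results: ComparisonResult): string {
  const rows = Object.entries(results);
  // Column width: widest model ID, or the header label if longer.
  const width = Math.max("Model".length, ...rows.map(([model]) => model.length));
  const header = `${"Model".padEnd(width)} | Response`;
  const lines = rows.map(
    ([model, response]) => `${model.padEnd(width)} | ${response}`
  );
  return [header, "-".repeat(header.length), ...lines].join("\n");
}

const sample: ComparisonResult = {
  "Phi-3-medium-128k-instruct": "AI is the broad field; ML is a subfield...",
  "gpt-4o-mini": "Machine learning is a subset of AI...",
};
console.log(renderTable(sample));
```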

Example Prompts

  • "list all available phi-3 models"
  • "compare Phi-3-medium-4k-instruct and Mistral-small on this prompt: how many ns in bananasss??"
  • "Do a comparison between the Phi-3-medium-128k-instruct, gpt-4o-mini, and Mistral-small models"

For more details, see the code and documentation in project.md.

Quick Start

  1. Clone the repository:
    git clone https://github.com/michaelwybraniec/mcp-server-gh-models-helper
    
  2. Install dependencies:
    cd mcp-server-gh-models-helper
    npm install
    
  3. Follow the documentation: Check the repository's README.md file for specific installation and usage instructions.

Repository Details

  • Owner: michaelwybraniec
  • Repo: mcp-server-gh-models-helper
  • Language: TypeScript
  • License: -
  • Last fetched: 8/10/2025
