mcp-server-demo

An MCP server demo by dev-johnny-gh (public)

**Summary:** `mcp-server-demo` is an example chat server built with LibreChat and Ollama that lets users create IP-aware chat agents with MongoDB integration and a web UI.

Repository Info

Stars: 0 | Forks: 0 | Watchers: 0 | Issues: 0
Language: TypeScript | License: MIT License

About This Server

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

Installation

  1. Build and start the IP server: `cd IpServer && npm install && npm run build && npm run start`

  2. Install a local MongoDB server and serve it on mongodb://127.0.0.1:27017

  3. Set up LibreChat: `git clone git@github.com:danny-avila/LibreChat.git && cd LibreChat && mv .env.example .env && npm install && npm run frontend && npm run backend`

  4. Add the following configuration to your librechat.yaml file:

mcpServers:
  ipServer:
    # type: sse # type can optionally be omitted
    url: http://localhost:3000/sse
    timeout: 60000 # 60-second timeout for this server (the default for MCP servers)

endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
      baseURL: "http://localhost:11434/v1/chat/completions"
      models:
        default:
          [
            "qwen2.5:3b-instruct-q4_K_M",
            "mistral:7b-instruct-q4_K_M",
            "gemma:7b-instruct-q4_K_M",
          ]
        # fetching the list of models is supported, but the `name` field must start
        # with `ollama` (case-insensitive), as it does in this example.
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Ollama"
  5. Download and run Ollama, pull a model from https://ollama.ai/models/, and serve Ollama on http://localhost:11434/
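
With the endpoint above in place, LibreChat talks to Ollama over its OpenAI-compatible API. As a rough sketch, this is the general shape of a chat-completions request body (the interface and helper below are illustrative, not LibreChat's actual code, and the exact fields LibreChat sends may differ):

```typescript
// Minimal shape of an OpenAI-style chat completion request, as accepted by
// Ollama's /v1/chat/completions endpoint configured above.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
}

// Build the JSON body for a single user turn.
function buildChatRequest(model: string, userMessage: string): ChatRequest {
  return {
    model,
    messages: [{ role: "user", content: userMessage }],
  };
}

// The body would be POSTed as JSON, e.g.:
//   fetch("http://localhost:11434/v1/chat/completions", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(buildChatRequest("qwen2.5:3b-instruct-q4_K_M", "hello")),
//   });
console.log(JSON.stringify(buildChatRequest("qwen2.5:3b-instruct-q4_K_M", "hello"), null, 2));
```

Remember that inside a Docker container the base URL would use `host.docker.internal` rather than `localhost`, per the comment in the config above.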

Usage

  1. Visit http://localhost:3080/ to see the LibreChat UI.

  2. Create a new agent named "Ollama", select Ollama as the model provider, and select a model.

  3. Click the Add Tools button below and add the get-external-ip, get-local-ip-v6, get-external-ip-v6, and get-local-ip tools.

  4. Ask the agent: "what's my local ip address?" / "what's my external ip address?" / "what's my external ipv6 address?" / "what's my internal ipv6 address?"

  5. The agent should invoke your tools and return the results.
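
The actual tool implementations live in IpServer; as an illustration of what a get-local-ip style tool can return, here is a minimal sketch using only Node's standard library (the function name and behavior are illustrative, not the repo's actual code):

```typescript
import { networkInterfaces } from "os";

// Collect non-internal addresses for the given IP family ("IPv4" or "IPv6").
// A get-local-ip / get-local-ip-v6 style tool could return exactly this list;
// the external-IP tools would instead query an outside service.
function getLocalAddresses(family: "IPv4" | "IPv6"): string[] {
  const addresses: string[] = [];
  for (const infos of Object.values(networkInterfaces())) {
    for (const info of infos ?? []) {
      if (!info.internal && info.family === family) {
        addresses.push(info.address);
      }
    }
  }
  return addresses;
}

console.log(getLocalAddresses("IPv4"));
```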

Quick Start

  1. Clone the repository: `git clone https://github.com/dev-johnny-gh/mcp-server-demo`

  2. Install dependencies: `cd mcp-server-demo && npm install`

  3. Follow the documentation: check the repository's README.md for specific installation and usage instructions.

Repository Details

Owner: dev-johnny-gh
Repo: mcp-server-demo
Language: TypeScript
License: MIT License
Last fetched: 8/10/2025
