any-to-mcp (rench, public)

A Node.js framework for bridging large language models with MCP servers.

Repository Info

  • Stars: 1
  • Forks: 0
  • Watchers: 1
  • Issues: 0
  • Language: TypeScript
  • License: -

About This Server

A Node.js framework for bridging large language models (LLMs) with MCP servers.


Documentation

any-to-mcp

A Node.js framework to bridge Large Language Models (LLMs) with Model Context Protocol (MCP) servers.

Overview

any-to-mcp is a framework that enables seamless integration between LLMs and MCP servers. It allows LLMs to execute tools via the MCP protocol without requiring desktop applications like Claude Desktop or Cursor. The framework handles parsing LLM responses for XML tags, executing the appropriate MCP tool calls, and managing the conversation flow.
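The response-parsing step described above can be sketched as a small, self-contained function. The tag name `use_mcp_tool` comes from the framework's documented tag list; the helper `parseToolUse`, the nested tag names, and the return shape are illustrative assumptions, not the library's actual API:

```typescript
// Minimal sketch: detect a <use_mcp_tool> block in raw LLM output and
// extract the server name, tool name, and JSON arguments.
// parseToolUse and its return type are hypothetical, for illustration only.
interface ToolUse {
  serverName: string;
  toolName: string;
  args: Record<string, unknown>;
}

function parseToolUse(llmOutput: string): ToolUse | null {
  const block = llmOutput.match(/<use_mcp_tool>([\s\S]*?)<\/use_mcp_tool>/);
  if (!block) return null;

  // Each field is expected as a nested tag inside the block.
  const field = (name: string): string | null => {
    const m = block[1].match(new RegExp(`<${name}>([\\s\\S]*?)</${name}>`));
    return m ? m[1].trim() : null;
  };

  const serverName = field('server_name');
  const toolName = field('tool_name');
  const rawArgs = field('arguments');
  if (!serverName || !toolName) return null;

  return { serverName, toolName, args: rawArgs ? JSON.parse(rawArgs) : {} };
}

const sample = `I'll check the weather.
<use_mcp_tool>
<server_name>example-server</server_name>
<tool_name>get_weather</tool_name>
<arguments>{"city": "San Francisco"}</arguments>
</use_mcp_tool>`;

const call = parseToolUse(sample);
console.log(call?.serverName, call?.toolName);
```

Once a call like this is extracted, the coordinator routes it to the matching MCP server and feeds the tool result back into the conversation.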

Features

  • LLM Adapters: Connect to different LLM providers (OpenAI, Anthropic)
  • XML Response Parsing: Parse LLM responses for MCP-related XML tags
  • MCP Server Management: Automatically handle MCP server connections
  • Tool Execution: Route tool calls to the appropriate MCP servers
  • Step-by-step Processing: Coordinate multi-step LLM-MCP interactions
  • Follow-up Questions: Support follow-up questions to users

Installation

npm install any-to-mcp

Quick Start

import { createAnyToMcp, CoordinatorState } from 'any-to-mcp';

async function main() {
  // Create and initialize the framework
  const coordinator = await createAnyToMcp({
    llm: {
      provider: 'openai',
      apiKey: 'your-openai-api-key',
      model: 'gpt-4',
    },
    // Optional MCP server configuration
    mcp: {
      name: 'example-server',
      url: 'http://localhost:3000/mcp',
      type: 'http',
    },
    // Maximum iterations before forcing completion
    maxIterations: 10,
  });

  // Start a conversation with a user message
  const result = await coordinator.start('Hello, can you help me find the weather in San Francisco?');

  // Process the result based on state
  if (result.state === CoordinatorState.COMPLETED) {
    console.log('Final answer:', result.message);
  } else if (result.state === CoordinatorState.WAITING_FOR_USER) {
    console.log('Question for user:', result.message);
    
    // User can respond with:
    // const nextResult = await coordinator.provideUserResponse('User response');
  }

  // Clean up when done
  await coordinator.close();
}

main();

Supported LLM Providers

  • OpenAI: GPT-3.5, GPT-4, and other OpenAI models
  • Anthropic: Claude models (Opus, Sonnet, etc.)
  • Custom HTTP: Support for custom HTTP endpoints
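Switching providers is, per the Quick Start above, a matter of changing the llm config block. A hedged sketch for Anthropic (the field set mirrors the OpenAI example; the specific model string is an assumption, not taken from the library's docs):

```typescript
// Same config shape as the Quick Start example, with the provider swapped.
// The model identifier below is an assumption for illustration.
const anthropicConfig = {
  llm: {
    provider: 'anthropic',
    apiKey: 'your-anthropic-api-key',
    model: 'claude-3-opus-20240229',
  },
  maxIterations: 10,
};

console.log(anthropicConfig.llm.provider);
```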

XML Tags Supported

The framework can detect and process the following XML tags in LLM responses:

  • <use_mcp_tool>: Execute a tool on an MCP server
  • <access_mcp_resource>: Access a resource from an MCP server
  • <ask_followup_question>: Ask a follow-up question to the user
  • <attempt_completion>: Finish the conversation with a result
  • <fetch_instructions>: Fetch instructions for a specific task
  • <thinking>: Internal thought process (ignored for execution)
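As a concrete example, a complete tool call as it might appear in an LLM response. The nested tag names (`server_name`, `tool_name`, `arguments`) follow common MCP-client conventions and are an assumption here, not confirmed by this README:

```xml
<use_mcp_tool>
<server_name>example-server</server_name>
<tool_name>get_weather</tool_name>
<arguments>{"city": "San Francisco"}</arguments>
</use_mcp_tool>
```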

Advanced Usage

Custom LLM Adapters

import { BaseLLMAdapter, LLMRequestOptions, LLMResponse, LLMResponseChunk } from 'any-to-mcp';

// Create a custom LLM adapter
class CustomAdapter extends BaseLLMAdapter {
  async sendRequest(messages, options) {
    // Implement your custom request logic
    return {
      content: 'Response from custom LLM',
      model: 'custom-model',
    };
  }

  async sendStreamingRequest(messages, callback, options) {
    // Implement your custom streaming logic
  }
}

// Use the custom adapter
const coordinator = await createAnyToMcp({
  llm: {
    provider: 'custom',
    // Custom adapter will be used
  },
});

Working with MCP Servers

import { MCPManager } from 'any-to-mcp';

// Create an MCP manager
const manager = new MCPManager({
  configPath: './mcpserver.json',
});

// Initialize and connect
await manager.initialize();
await manager.connectAll();

// Execute a tool
const result = await manager.executeTool({
  serverName: 'example-server',
  toolName: 'example-tool',
  arguments: { param1: 'value1' },
});

// Clean up
await manager.disconnectAll();
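The configPath above points at a JSON file describing the servers to manage. Its exact schema is not documented here, but a plausible sketch, mirroring the mcp block from the Quick Start example, might look like:

```json
{
  "servers": [
    {
      "name": "example-server",
      "url": "http://localhost:3000/mcp",
      "type": "http"
    }
  ]
}
```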

Examples

Check the examples directory for more usage examples:

  • simple-chat.ts: Basic chat interface using the framework

API Reference

Core Functions

  • createAnyToMcp(config): Create and initialize the framework
  • coordinator.start(userMessage): Start a conversation
  • coordinator.provideUserResponse(response): Provide user input
  • coordinator.getState(): Get the current state
  • coordinator.reset(): Reset the conversation
  • coordinator.close(): Clean up resources

State Management

The coordinator can be in one of the following states:

  • CoordinatorState.IDLE: Ready for a new conversation
  • CoordinatorState.PROCESSING: Processing a request
  • CoordinatorState.WAITING_FOR_USER: Waiting for user input
  • CoordinatorState.COMPLETED: Conversation completed
  • CoordinatorState.ERROR: Error occurred
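The states above form a small state machine, which suggests a simple driver loop. A self-contained sketch, where `MockCoordinator` is a stand-in so the code runs without a live LLM or MCP server (the real coordinator's API is listed under Core Functions):

```typescript
// Driver loop: keep calling into the coordinator until it reaches a
// terminal state. MockCoordinator is a stand-in for demonstration only.
enum CoordinatorState {
  IDLE = 'idle',
  PROCESSING = 'processing',
  WAITING_FOR_USER = 'waiting_for_user',
  COMPLETED = 'completed',
  ERROR = 'error',
}

interface StepResult {
  state: CoordinatorState;
  message: string;
}

class MockCoordinator {
  private step = 0;

  async start(userMessage: string): Promise<StepResult> {
    this.step = 1;
    // First turn: pretend the LLM asked a follow-up question.
    return { state: CoordinatorState.WAITING_FOR_USER, message: 'Which city?' };
  }

  async provideUserResponse(response: string): Promise<StepResult> {
    this.step += 1;
    return { state: CoordinatorState.COMPLETED, message: `Weather for ${response}: sunny` };
  }
}

async function drive(
  coordinator: MockCoordinator,
  firstMessage: string,
  answers: string[],
): Promise<string> {
  let result = await coordinator.start(firstMessage);
  // Keep answering follow-up questions until the conversation terminates.
  while (result.state === CoordinatorState.WAITING_FOR_USER && answers.length > 0) {
    result = await coordinator.provideUserResponse(answers.shift()!);
  }
  if (result.state === CoordinatorState.ERROR) throw new Error(result.message);
  return result.message;
}

drive(new MockCoordinator(), 'What is the weather?', ['San Francisco'])
  .then((answer) => console.log(answer));
```

The same loop shape applies to the real coordinator: inspect `result.state`, answer `WAITING_FOR_USER` via `provideUserResponse`, and stop on `COMPLETED` or `ERROR`.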

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Quick Start

  1. Clone the repository:

     git clone https://github.com/rench/any-to-mcp

  2. Install dependencies:

     cd any-to-mcp
     npm install

  3. Follow the documentation: check the repository's README.md file for specific installation and usage instructions.

Repository Details

  • Owner: rench
  • Repo: any-to-mcp
  • Language: TypeScript
  • License: -
  • Last fetched: 8/10/2025
