
Think MCP Host (AI·Zen·Love)
Think MCP Host (AI·Zen·Love) is a Model Context Protocol (MCP) based intelligent agent application that supports various types of large language models, including standard conversational models (LLM), vision language models (VLM), and reasoning models.
(Screenshots: terminal interface and chat interface)
Features
- Complete MCP (Model Context Protocol) Implementation
  - Full MCP architecture support (Host/Client/Server)
  - Comprehensive support for MCP resource types
    - Resources: dynamic integration of external content
    - Prompts: template-based system prompts
    - Tools: AI-powered function calls
  - Dynamic MCP command insertion anywhere in conversations
  - Seamless integration of resources into context
  - On-demand prompt template usage
  - Direct tool execution within chat
  - Standalone MCP tool execution support
- Extensive Model Support
  - LLM (Language Models)
    - Text conversations and content generation
    - Programming and code assistance
    - Document writing and analysis
  - VLM (Vision Language Models)
    - Image understanding and analysis
    - Visual content processing
  - Reasoning Models
    - Complex logical analysis
    - Professional domain reasoning
  - Multiple provider support (DeepSeek, OpenAI, OpenRouter, etc.)
- Advanced Conversation Management
  - Automatic conversation history saving
  - Manual save options with countdown timer
  - Historical conversation loading
  - Multiple export format support
- System Features
  - Rich Terminal Interface
    - Beautiful markdown rendering in the terminal
    - Syntax highlighting for code blocks
    - Unicode and emoji support
    - Interactive command suggestions
  - Cross-Platform Support
    - Full functionality on Windows, macOS, and Ubuntu
    - Native installation support for each platform
    - Consistent user experience across systems
  - Command-line interface
  - Debug mode support
  - Flexible exit options with save/discard choices
Usage Guide
Running Mode Selection
The program supports two main running modes:
- Chat Mode (default)
  - Used for natural language dialogue
  - Supports multiple LLM models
  - Can use MCP enhancement features
- Tool Mode
  - Used for running specific AI tools
  - Directly calls functions provided by an MCP server
Detailed Usage Process
- Select Running Mode
  - After the program starts, you will be prompted to select a running mode
  - Enter `1` to select Chat mode
  - Enter `2` to select Tool mode
- Chat Mode Setup Process
  - Select an LLM model
    - The system displays the list of available models
    - Enter the corresponding number to select a model
    - Supported providers include DeepSeek, Silicon Flow, Volcano Engine, etc.
  - Choose a start method
    - Option 1: Set a system prompt, then start a new conversation
    - Option 2: Directly start a new conversation (default)
    - Option 3: Load a historical conversation
  - Set the system prompt (if Option 1 was selected)
    - You can input a custom system prompt
    - Supports using the `->mcp` command to insert MCP resources
  - Load a historical conversation (if Option 3 was selected)
    - The system displays the list of saved conversations
    - Select the conversation record to load
- Tool Mode Setup Process
  - Select an MCP client
    - The system displays the list of available MCP clients
    - Select the client to use
  - Select a tool
    - Displays the list of tools provided by the selected client
    - Select the specific tool to use
  - Execute the tool
    - Provide the necessary parameters according to the tool's requirements
    - View the tool's execution results
  - Continue or exit
    - Choose whether to continue using other tools
    - You can switch back to Chat mode at any time
Basic Chat Mode
- Start a conversation by directly typing text
- Use `Ctrl+C` to exit the program
MCP Enhanced Mode
During a conversation, you can use the `->mcp` command to access MCP's enhancement features. The steps are as follows:
- Activate the MCP command
  - Type `->mcp` alone and press Enter in the conversation
  - The system will guide you through the subsequent selections
- Select an MCP client
  - The system displays the list of available MCP clients
  - Select the client to use
- Select an MCP feature type
  - The system prompts you to select one of three types:
  - Resources
    - Enter `1` to select
    - Used for selecting and referencing external resources (such as images or documents)
    - Returned format: `->mcp_resources[client_name]:resourceURI`
  - Prompts
    - Enter `2` to select
    - Used for selecting predefined prompt templates
    - Returned format: `->mcp_prompts[client_name]:prompt_name{parameters}`
  - Tools
    - Enter `3` to select
    - Used for selecting and using specific AI tools
    - Returned format: `->mcp_tools[client_name]:tool_name{parameters}`
- Complete the selection
  - After the selection is complete, the system inserts the corresponding MCP command into the conversation
  - You can continue editing the message, or send it directly
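For example, a message with an inserted resource command might look like the following (the resource URI here is purely illustrative):

```text
Please summarize this document: ->mcp_resources[think-mcp]:file:///docs/example.md
```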
Installation and Running
Development Installation
You can install the project directly from source for development:
# Clone the repository
git clone https://github.com/thinkthinking/think-mcp-host.git
cd think-mcp-host
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate # On Linux/macOS
# or
.venv\Scripts\Activate.ps1 # On Windows with PowerShell
# Install in development mode with pip
pip install -e .
# or with uv (recommended)
uv pip install -e .
Windows
- Installation methods
  - Download and double-click `AI-Zen-Love.exe`
  - Or install and run via the command line:
# Install uv using pip
python -m pip install uv
# Clone the project and install
git clone https://github.com/thinkthinking/think-mcp-host.git
cd think-mcp-host
python -m venv .venv
.venv\Scripts\Activate.ps1
uv pip install -e .
- Configuration file locations
  - LLM configuration: `C:\Users\your-username\.think-llm-client\config\servers_config.json`
  - MCP configuration: `C:\Users\your-username\.think-mcp-client\config\mcp_config.json`
  - History records: `C:\Users\your-username\.think-mcp-host\command_history\`
macOS
- Installation methods
  - Download and double-click `AI-Zen-Love.app`
  - Or install and run via the terminal:
# Install uv
python3 -m pip install uv
# Clone the project and install
git clone https://github.com/thinkthinking/think-mcp-host.git
cd think-mcp-host
python3 -m venv .venv
source .venv/bin/activate
uv pip install -e .
- Configuration file locations
  - LLM configuration: `/Users/your-username/.think-llm-client/config/servers_config.json`
  - MCP configuration: `/Users/your-username/.think-mcp-client/config/mcp_config.json`
  - History records: `/Users/your-username/.think-mcp-host/command_history/`
Configuration Details
Model Configuration
The project supports three types of models:
- LLM (Language Models)
  - Used for: text conversations, code writing, document generation
  - Examples: DeepSeek Chat, GPT-4
- VLM (Vision Language Models)
  - Used for: image understanding and analysis
  - Examples: GPT-4-Vision, Qwen-VL-Plus
- Reasoning Models
  - Used for: complex reasoning and professional analysis
  - Examples: DeepSeek Reasoner, DeepSeek-R1
LLM Configuration
The configuration file uses JSON format, with a separate section for each model type:
{
  "llm": {
    "providers": {
      "deepseek": {
        "api_key": "<DEEPSEEK_API_KEY>",
        "api_url": "https://api.deepseek.com",
        "model": {
          "deepseek-chat": {
            "max_completion_tokens": 8192
          }
        }
      }
    }
  },
  "vlm": {
    "providers": {
      "openai": {
        "api_key": "<OPENAI_API_KEY>",
        "api_url": "https://api.openai.com/v1",
        "model": {
          "gpt-4-vision": {
            "max_completion_tokens": 4096
          }
        }
      }
    }
  },
  "reasoning": {
    "providers": {
      "deepseek": {
        "api_key": "<DEEPSEEK_API_KEY>",
        "api_url": "https://api.deepseek.com",
        "model": {
          "deepseek-reasoner": {
            "max_completion_tokens": 8192,
            "temperature": 0.6
          }
        }
      }
    }
  }
}
Configuration explanation:
- Choose the configuration section according to the model type (llm/vlm/reasoning)
- Multiple providers can be configured under each type
- Each provider needs the following fields:
  - `api_key`: API key
  - `api_url`: API server address
  - `model`: specific model configuration
    - `max_completion_tokens`: maximum output length
    - `temperature`: temperature parameter (optional)
Configuration instructions for the different providers:
- DeepSeek documentation: https://api-docs.deepseek.com/en/
- Silicon Flow documentation: https://docs.siliconflow.cn/en/userguide/quickstart#4-siliconcloud-api-genai
- Volcano Engine documentation: https://www.volcengine.com/docs/82379/1399008
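As a concrete illustration, the following minimal Python sketch reads this configuration and pulls out one provider entry. It assumes the file lives at the path listed in the installation sections above and follows the structure shown; the function name `load_model_config` is ours for illustration, not part of the project's API.

```python
import json
from pathlib import Path

# Config path as documented in the installation sections above.
CONFIG_PATH = Path.home() / ".think-llm-client" / "config" / "servers_config.json"

def load_model_config(model_type: str, provider: str) -> dict:
    """Return one provider entry for a model type ("llm", "vlm", or "reasoning")."""
    config = json.loads(CONFIG_PATH.read_text(encoding="utf-8"))
    entry = config[model_type]["providers"][provider]
    # Check the required fields listed in the explanation above.
    for field in ("api_key", "api_url", "model"):
        if field not in entry:
            raise KeyError(f"{model_type}/{provider} is missing '{field}'")
    return entry

if __name__ == "__main__":
    deepseek = load_model_config("llm", "deepseek")
    print(sorted(deepseek["model"]))  # e.g. ['deepseek-chat']
```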
MCP Server Configuration
MCP (Model Context Protocol) server configuration example:
{
  "mcpServers": {
    "think-mcp": {
      "command": "/opt/homebrew/bin/uv",
      "args": [
        "--directory",
        "/Users/thinkthinking/src_code/nas/think-mcp",
        "run",
        "think-mcp"
      ]
    }
  }
}
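The `command` and `args` fields describe how to launch the server process. In the real host this is handled by the MCP client library, which then exchanges protocol messages with the server over stdio; the following sketch only illustrates how such an entry might be resolved into a subprocess, under the assumption that the configuration looks like the example above.

```python
import json
import subprocess
from pathlib import Path

# Config path as documented in the installation sections above.
MCP_CONFIG_PATH = Path.home() / ".think-mcp-client" / "config" / "mcp_config.json"

def launch_mcp_server(name: str) -> subprocess.Popen:
    """Start the named MCP server as a child process; MCP messages flow over stdio."""
    config = json.loads(MCP_CONFIG_PATH.read_text(encoding="utf-8"))
    entry = config["mcpServers"][name]
    return subprocess.Popen(
        [entry["command"], *entry.get("args", [])],
        stdin=subprocess.PIPE,    # host -> server messages
        stdout=subprocess.PIPE,   # server -> host messages
    )

# Example (assuming the configuration shown above):
# proc = launch_mcp_server("think-mcp")
```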
MCP Commands
The following MCP command formats can be used in conversations:
- Interactive Selection
->mcp
This will start an interactive selection interface, guiding you to choose:
- MCP client
- Operation type (Resources/Prompts/Tools)
- Specific resource/prompt/tool
- Related parameters (if needed)
- Direct Usage
# Use resources
->mcp_resources[client_name]:resource_uri
# Use prompts
->mcp_prompts[client_name]:prompt_name{param1:value1,param2:value2}
# Use tools
->mcp_tools[client_name]:tool_name{param1:value1,param2:value2}
Examples:
# Use prompts
->mcp_prompts[think-mcp]:agent-introduction{agent_name:AI Assistant,agent_description:A friendly AI assistant}
# Use tools
->mcp_tools[think-mcp]:analyze_content{text:This is a test text}
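To make the command grammar concrete, here is a minimal, hypothetical parsing sketch for the direct-usage format shown above; the project's actual parser may differ.

```python
import re

# Matches the three direct-usage forms above; the {parameters} block is optional.
MCP_COMMAND = re.compile(
    r"->mcp_(?P<kind>resources|prompts|tools)"
    r"\[(?P<client>[^\]]+)\]"
    r":(?P<name>[^{\s]+)"
    r"(?:\{(?P<params>[^}]*)\})?"
)

def parse_command(text: str) -> dict:
    """Parse the first MCP command found in a message."""
    match = MCP_COMMAND.search(text)
    if match is None:
        raise ValueError("no MCP command found")
    params = {}
    if match.group("params"):
        # Key:value pairs separated by commas, e.g. {param1:value1,param2:value2}.
        params = dict(pair.split(":", 1) for pair in match.group("params").split(","))
    return {
        "kind": match.group("kind"),
        "client": match.group("client"),
        "name": match.group("name"),
        "params": params,
    }

print(parse_command("->mcp_tools[think-mcp]:analyze_content{text:This is a test text}"))
# {'kind': 'tools', 'client': 'think-mcp', 'name': 'analyze_content',
#  'params': {'text': 'This is a test text'}}
```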
Command features:
- Multiple MCP commands are supported in the same input
- Commands can be edited and modified at any time
- Parameters use a flexible key:value pair format
- Friendly error messages
Releasing New Versions
To release a new version, follow these steps:
- Update the version number
  - Update the `version` field in `pyproject.toml`
  - Follow Semantic Versioning (for example, bump 1.2.3 to 1.3.0 for a backwards-compatible feature release)
- Commit the changes:
git add pyproject.toml
git commit -m "chore: bump version to x.x.x"
- Create and push a version tag:
git tag vx.x.x
git push origin vx.x.x