
mcp use cli
A command-line chat tool based on the MCP protocol that lets the AI use web browsing and other capabilities.
MCP Chat CLI
A simple interactive chat application using MCP (Model Context Protocol) that allows AI to access tools like web browsing.
Features
- Interactive command-line chat interface
- Web browsing capability through Playwright MCP
- Conversation memory to maintain context
- Support for OpenAI and Groq models
Prerequisites
- Python 3.11+
- uv package manager
- API keys for LLM providers (OpenAI/Groq)
Installation
- Clone this repository:
git clone https://github.com/cthuaung/mcp-use-cli.git
cd mcp-use-cli
- Set up a virtual environment with uv:
uv init
uv venv
- Install dependencies using uv:
uv add python-dotenv langchain-groq langchain-openai mcp-use
- Create a .env file with your API keys (the sketch after this list shows how these files are loaded):
OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
- Create a browser_mcp.json configuration file:
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
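If it helps to see how these two files are consumed, here is a minimal sketch assuming the mcp-use API (MCPClient.from_config_file) and python-dotenv; the repository's app.py may wire this up differently:

```python
# Sketch only: load the API keys and the Playwright MCP server config.
from dotenv import load_dotenv
from mcp_use import MCPClient

load_dotenv()  # exposes OPENAI_API_KEY / GROQ_API_KEY to the LangChain clients

# Reads browser_mcp.json and prepares the Playwright MCP server defined there
client = MCPClient.from_config_file("browser_mcp.json")
```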
Usage
Run the application:
uv run app.py
Chat Commands
- Type your messages normally to chat with the AI
- Type exit or quit to end the conversation
- Type clear to clear the conversation history
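A chat loop that handles these commands could look like the sketch below. The agent.run call follows the mcp-use API; clear_conversation_history is assumed from the library's conversation-memory support, and the actual app.py may name things differently:

```python
async def chat_loop(agent):
    """Read user input and dispatch the exit/quit/clear commands."""
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in ("exit", "quit"):
            break
        if user_input.lower() == "clear":
            agent.clear_conversation_history()  # assumed MCPAgent memory helper
            print("Conversation history cleared.")
            continue
        response = await agent.run(user_input)  # agent is an mcp-use MCPAgent
        print(f"Assistant: {response}")
```

The loop would then be driven with asyncio.run(chat_loop(agent)) once the agent is constructed, as sketched under How It Works.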
How It Works
This application uses the MCP-Use library to create an agent that can access tools through the Model Context Protocol. The agent uses LangChain and supports multiple LLM providers like OpenAI and Groq.
The main features include:
- Built-in conversation memory for contextual interactions
- Web browsing capabilities through Playwright MCP
- Simple command-line interface for easy interaction
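Putting the pieces together, the agent wiring is roughly as follows. The constructor arguments (max_steps, memory_enabled) are taken from the mcp-use documentation and are an assumption about how app.py is set up, not a copy of it:

```python
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

client = MCPClient.from_config_file("browser_mcp.json")
llm = ChatOpenAI(model="gpt-4o")

# memory_enabled keeps prior turns in context, which is what gives the CLI
# its conversation memory; max_steps bounds the tool-use loop per request.
agent = MCPAgent(llm=llm, client=client, max_steps=15, memory_enabled=True)
```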
Customization
You can switch LLM providers by uncommenting and configuring a different model in app.py (the imports shown are required for each provider):
from langchain_openai import ChatOpenAI
from langchain_groq import ChatGroq

# Choose your preferred model
llm = ChatOpenAI(model="gpt-4o")
# llm = ChatGroq(model="llama-3.3-70b-versatile")
License
MIT
Credits
This project uses the MCP-Use library by Pietro Zullo.