mcp-use-cli

A public MCP server by cthuaung

A command-line chat tool built on the MCP protocol that lets an AI use web browsing and other tools.

Repository Info

  • Stars: 0
  • Forks: 0
  • Watchers: 0
  • Issues: 0
  • Language: Python
  • License: -

About This Server

Model Context Protocol (MCP): this server can be integrated with AI applications to provide additional context and tool capabilities.

Documentation

MCP Chat CLI

A simple interactive chat application using MCP (Model Context Protocol) that allows AI to access tools like web browsing.

Features

  • Interactive command-line chat interface
  • Web browsing capability through Playwright MCP
  • Conversation memory to maintain context
  • Support for OpenAI and Groq models

Prerequisites

  • Python 3.11+
  • uv package manager
  • API keys for LLM providers (OpenAI/Groq)

Installation

  1. Clone this repository:
git clone https://github.com/cthuaung/mcp-use-cli.git
cd mcp-use-cli
  2. Set up a virtual environment with uv:
uv init
uv venv
  3. Install dependencies using uv:
uv add python-dotenv langchain-groq langchain-openai mcp-use
  4. Create a .env file with your API keys:
OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
  5. Create a browser_mcp.json configuration file:
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "env": {
        "DISPLAY": ":1"
      }
    }
  }
}
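
As a quick sanity check after steps 4 and 5, a short script along these lines can confirm that the keys in .env are visible and that browser_mcp.json parses. The script and its filename are illustrative and not part of the repository.

# check_setup.py (hypothetical helper, not part of this repository)
import json
import os

from dotenv import load_dotenv  # from the python-dotenv dependency

# Load OPENAI_API_KEY / GROQ_API_KEY from the .env file created in step 4
load_dotenv()

for key in ("OPENAI_API_KEY", "GROQ_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'missing'}")

# Parse the MCP server configuration created in step 5
with open("browser_mcp.json") as f:
    config = json.load(f)

print("Configured MCP servers:", ", ".join(config["mcpServers"]))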

Usage

Run the application:

uv run app.py

Chat Commands

  • Type your messages normally to chat with the AI
  • Type exit or quit to end the conversation
  • Type clear to clear the conversation history
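
Internally, these commands map onto a simple input loop. The sketch below is illustrative only; the function name, prompt strings, and the agent calls are assumptions, not code taken from app.py.

# Illustrative command loop; names and prompts are assumptions, not app.py code.
async def chat_loop(agent):
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in ("exit", "quit"):
            print("Goodbye!")
            break
        if user_input.lower() == "clear":
            # Assumes the agent exposes some way to reset its stored turns
            agent.clear_conversation_history()
            print("Conversation history cleared.")
            continue
        # Any other input is forwarded to the agent, which may call MCP tools
        response = await agent.run(user_input)
        print(f"AI: {response}")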

How It Works

This application uses the MCP-Use library to create an agent that can access tools through the Model Context Protocol. The agent uses LangChain and supports multiple LLM providers like OpenAI and Groq.

The main features include:

  • Built-in conversation memory for contextual interactions
  • Web browsing capabilities through Playwright MCP
  • Simple command-line interface for easy interaction
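
A minimal sketch of that wiring is shown below. It follows the MCPClient/MCPAgent interface described in the mcp-use documentation (from_config_file, memory_enabled, run); treat it as an assumption about this repository's app.py rather than a copy of it.

# Sketch based on the mcp-use documentation, not the repository's app.py.
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    load_dotenv()  # pick up OPENAI_API_KEY / GROQ_API_KEY from .env

    # Spawn the Playwright MCP server declared in browser_mcp.json
    client = MCPClient.from_config_file("browser_mcp.json")

    # Any LangChain chat model works; ChatGroq could be swapped in here
    llm = ChatOpenAI(model="gpt-4o")

    # memory_enabled keeps earlier turns so follow-up questions have context
    agent = MCPAgent(llm=llm, client=client, memory_enabled=True, max_steps=30)

    result = await agent.run("Open example.com and summarize the page")
    print(result)

asyncio.run(main())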

Customization

You can switch the LLM by uncommenting and configuring a different model in the app.py file:

# Choose your preferred model
llm = ChatOpenAI(model="gpt-4o")
# llm = ChatGroq(model="llama-3.3-70b-versatile")
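
If both API keys are configured, a small helper could also pick the provider at runtime based on the environment; this is a hypothetical convenience, not something present in app.py.

# Hypothetical helper: choose a LangChain chat model from whichever key is set.
import os

from langchain_groq import ChatGroq
from langchain_openai import ChatOpenAI

def make_llm():
    if os.getenv("GROQ_API_KEY"):
        return ChatGroq(model="llama-3.3-70b-versatile")
    return ChatOpenAI(model="gpt-4o")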

License

MIT

Credits

This project uses the MCP-Use library by Pietro Zullo.

Quick Start

  1. Clone the repository:
git clone https://github.com/cthuaung/mcp-use-cli
  2. Install dependencies:
cd mcp-use-cli
uv add python-dotenv langchain-groq langchain-openai mcp-use
  3. Follow the documentation:
Check the repository's README.md file for specific installation and usage instructions.

Repository Details

  • Owner: cthuaung
  • Repo: mcp-use-cli
  • Language: Python
  • License: -
  • Last fetched: 8/10/2025

Recommended MCP Servers

💬 Discord MCP
Enable AI assistants to seamlessly interact with Discord servers, channels, and messages.
Tags: integrations, discord, chat

🔗 Knit MCP
Connect AI agents to 200+ SaaS applications and automate workflows.
Tags: integrations, automation, saas

🕷️ Apify MCP Server
Deploy and interact with Apify actors for web scraping and data extraction.
Tags: apify, crawler, data

🌐 BrowserStack MCP
BrowserStack MCP Server for automated testing across multiple browsers.
Tags: testing, qa, browsers

Zapier MCP
A Zapier server that provides automation capabilities for various apps.
Tags: zapier, automation