
# MCP Client: Agentic Chatbot with Tool-Use and Streamlit UI

## Overview
This project is an agentic chatbot client that can use external tools (such as SQLite and Puppeteer servers) to answer user queries, automate workflows, and perform multi-step reasoning. It features both a command-line interface and a modern Streamlit web UI. The chatbot is powered by an LLM (via Groq API) and can chain tool calls, passing outputs from one tool as inputs to the next.
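The loop below is a minimal, hypothetical sketch of that behavior; the helper names `call_llm` and `execute_tool` are placeholders, not this repository's API. The LLM either answers directly or returns a JSON list of tool calls, which the client executes and feeds back into the conversation.

```python
# Hypothetical sketch of the agentic loop; not the repo's actual code.
import json

def call_llm(messages):
    """Placeholder for the Groq API call. Returns either a plain answer
    or a JSON list of tool calls (see "Tool Chaining" below)."""
    raise NotImplementedError

def execute_tool(name, arguments):
    """Placeholder for dispatching a call to an MCP tool server."""
    raise NotImplementedError

def run_agent(user_message):
    messages = [{"role": "user", "content": user_message}]
    while True:
        reply = call_llm(messages)
        try:
            tool_calls = json.loads(reply)      # the LLM asked for tools
        except (json.JSONDecodeError, TypeError):
            return reply                        # plain text: final answer
        if not isinstance(tool_calls, list):
            return reply
        for call in tool_calls:
            result = execute_tool(call["tool"], call["arguments"])
            messages.append({"role": "tool", "content": str(result)})
```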
## Features
- Agentic LLM Chatbot: Uses an LLM to interpret user queries and decide when/how to use tools.
- Tool Chaining: Supports multi-step workflows where the output of one tool is used as input for another.
- SQLite Tooling: Query, update, and manage a SQLite database via an MCP server.
- Puppeteer Tooling: Automate browser actions (navigate, click, fill, screenshot, evaluate JS, etc.) via Puppeteer MCP server.
- Streamlit Web UI: Chat with the agent, view logs, and see tool outputs in a modern web interface.
- Command-Line Mode: Interact with the agent via the terminal.
- Logging: All actions and errors are logged to `app.log` for transparency and debugging.
## Folder Structure

```
app_streamlit.py      # Streamlit web app for chat and tool use
main.py               # Main agentic chatbot logic (CLI entrypoint)
requirements.txt      # Python dependencies
servers_config.json   # Configuration for MCP tool servers
app.log               # Log file (auto-generated)
test.db               # SQLite database (auto-generated/used by the sqlite server)
.env                  # Environment variables (API keys, etc.)
```
## Setup Instructions

### 1. Python Version
- Windows users: You must use Python 3.10 or 3.11. Python 3.13+ is not supported for async subprocesses (required for tool servers).
- Download Python 3.11
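If you want to fail fast on an unsupported interpreter, a small guard like the following (illustrative, not part of the repository) can be placed near the top of `main.py`:

```python
# Optional guard: asyncio subprocesses used for the MCP stdio servers
# do not work on Windows with Python 3.13+ for this client.
import sys

if sys.platform == "win32" and sys.version_info >= (3, 13):
    raise SystemExit(
        "Python 3.13+ on Windows is not supported for async subprocesses; "
        "use Python 3.10 or 3.11 instead."
    )
```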
### 2. Clone and Install

```powershell
# Clone the repo (if not already)
cd path\to\your\projects
git clone https://github.com/yupcoding1/mcp_client
cd mcp_client

# Create and activate a virtual environment
py -3.11 -m venv venv
.\venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
### 3. Environment Variables

- Create a `.env` file in the project root with your LLM API key:

```
LLM_API_KEY=your_groq_api_key_here
```
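Assuming the client uses `python-dotenv` (a common choice; check `requirements.txt`), the key can be loaded like this sketch:

```python
# Sketch: load LLM_API_KEY from .env, assuming python-dotenv is installed.
import os
from dotenv import load_dotenv

load_dotenv()                          # reads .env from the project root
api_key = os.getenv("LLM_API_KEY")
if not api_key:
    raise RuntimeError("LLM_API_KEY is missing; add it to your .env file.")
```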
### 4. Configure Tool Servers

- Edit `servers_config.json` to define which MCP servers to use. Example:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
    },
    "puppeteer": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
    }
  }
}
```

- Ensure you have Node.js and the required MCP servers installed globally or accessible via `npx`/`uvx`.
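For reference, here is how one configured server can be started and queried, assuming the client is built on the official `mcp` Python SDK; the exact wiring in this repository may differ:

```python
# Sketch: launch the "sqlite" server from servers_config.json and list its tools.
# Assumes the official `mcp` Python SDK (pip package "mcp").
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def list_sqlite_tools():
    with open("servers_config.json", encoding="utf-8") as f:
        entry = json.load(f)["mcpServers"]["sqlite"]
    params = StdioServerParameters(command=entry["command"], args=entry["args"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(list_sqlite_tools())
```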
## Usage

### Command-Line Chatbot

```
python main.py
```

- Type your message and press Enter.
- Type `exit` or `quit` to stop.
- The agent will use tools as needed and log all actions to `app.log`.
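The command-line loop has roughly this shape (a simplified sketch; `agent_respond` stands in for the agent's reply function and is not a name from the repository):

```python
# Illustrative CLI loop; the real main.py may differ in detail.
def chat_loop(agent_respond):
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in ("exit", "quit"):
            print("Goodbye!")
            break
        print("Agent:", agent_respond(user_input))
```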
### Streamlit Web App

```
streamlit run app_streamlit.py
```
- Open the provided local URL in your browser.
- Chat with the agent, view logs, and see tool outputs.
- If you are on Windows with Python 3.13+, you will see an error message (use Python 3.11).
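A minimal sketch of such a chat page, assuming Streamlit 1.24+ (`st.chat_input`/`st.chat_message`); the real `app_streamlit.py` may differ in detail:

```python
# Minimal Streamlit chat sketch with a log expander (illustrative only).
import streamlit as st

st.title("MCP Agentic Chatbot")

if "history" not in st.session_state:
    st.session_state.history = []

for role, text in st.session_state.history:
    with st.chat_message(role):
        st.write(text)

if prompt := st.chat_input("Ask the agent..."):
    st.session_state.history.append(("user", prompt))
    reply = f"(agent reply to: {prompt})"   # placeholder: call the agent here
    st.session_state.history.append(("assistant", reply))
    st.rerun()

with st.expander("Show Log Output"):
    try:
        with open("app.log", encoding="utf-8") as f:
            st.text(f.read())
    except FileNotFoundError:
        st.caption("No log file yet.")
```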
### Tool Chaining (Agentic Mode)

- The agent can execute multiple tools in sequence, passing results between them.
- Example LLM tool call (handled automatically):

```json
[
  {"tool": "puppeteer_navigate", "arguments": {"url": "https://news.google.com/"}, "result_var": "page_content"},
  {"tool": "summarize_text", "arguments": {"text": "$page_content"}, "result_var": "summary"},
  {"tool": "write_query", "arguments": {"query": "INSERT INTO summaries (text) VALUES ('$summary')"}}
]
```

- The agent resolves `$page_content` and `$summary` automatically.
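Conceptually, the substitution works like the sketch below (illustrative; the repository's implementation may differ): each step's result is stored under its `result_var`, and later `$name` references are replaced before the next tool runs.

```python
# Sketch of $variable resolution in a tool chain (not the repo's exact code).
def resolve_value(value, variables):
    """Replace every "$name" occurrence in string values with stored results."""
    if isinstance(value, str):
        for name, result in variables.items():
            value = value.replace(f"${name}", str(result))
    return value

def run_chain(plan, execute_tool):
    """`execute_tool(name, args)` is a placeholder for the MCP tool dispatcher."""
    variables = {}
    for step in plan:
        args = {k: resolve_value(v, variables)
                for k, v in step.get("arguments", {}).items()}
        result = execute_tool(step["tool"], args)
        if "result_var" in step:
            variables[step["result_var"]] = result
    return variables
```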
## Logs

- All actions, tool calls, and errors are logged to `app.log`.
- In the Streamlit app, you can view logs in the "Show Log Output" expander.
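A typical way to produce such a log file with the standard library (a sketch; the repository's exact logging setup may differ):

```python
# Illustrative logging setup writing to app.log.
import logging

logging.basicConfig(
    filename="app.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logging.getLogger(__name__).info("Client started")
```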
## Troubleshooting

- Python 3.13+ on Windows: not supported for async subprocesses; use Python 3.11.
- Tool server fails to start: check your `servers_config.json` and ensure `uvx`, `npx`, and the MCP servers are installed and on your PATH.
- No tools available: make sure the servers are running and initialized (see the logs for errors).
- API errors: check your `.env` file and API key.
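To rule out PATH problems quickly, a small check like this (illustrative, not part of the repository) can help:

```python
# Verify that the launcher commands for the MCP servers are on PATH.
import shutil

for cmd in ("uvx", "npx"):
    if shutil.which(cmd) is None:
        print(f"Warning: '{cmd}' was not found on PATH; "
              f"the corresponding MCP server cannot be started.")
```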
## Credits
- Model Context Protocol (MCP)
- Streamlit
- Groq API
## License
This project is for educational and research purposes. See individual tool/server repos for their licenses.