
mcp_cli_langgraph

Langgraph MCP Adapters

Repository Info

  • Stars: 1
  • Forks: 0
  • Watchers: 1
  • Issues: 0
  • Language: Python
  • License: Other

About This Server

Langgraph MCP Adapters

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

MCP CLI LangChain Demo

This project demonstrates how to use the langchain-mcp-adapters library to connect a LangChain agent with multiple Model Context Protocol (MCP) servers, allowing the agent to leverage tools provided by these servers.
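
For orientation, the sketch below shows the overall flow the demo scripts implement: load the YAML server configuration, connect through the adapter's MultiServerMCPClient, and hand the resulting tools to a LangGraph ReAct agent backed by a local Ollama model. It is an illustration only, not a copy of main.py or chat_interface.py; the file name, the assumption that the YAML maps directly to the client's connection format, and the exact adapter API (which may differ in the vendored copy) are all assumptions.

    # sketch_agent.py - illustrative sketch only, not the repository's actual scripts
    import asyncio
    import os

    import yaml
    from dotenv import load_dotenv
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langchain_ollama import ChatOllama
    from langgraph.prebuilt import create_react_agent


    async def main() -> None:
        load_dotenv()  # picks up OLLAMA_MODEL / OLLAMA_BASE_URL / OLLAMA_TEMPERATURE from .env

        # Load server definitions; keys are assumed to already match the
        # connection fields the client expects (command/args/transport or url/transport).
        with open("mcp_servers_config.yaml") as f:
            connections = yaml.safe_load(f)

        client = MultiServerMCPClient(connections)
        tools = await client.get_tools()  # older adapter releases use an `async with` block instead
        print(f"Loaded {len(tools)} tools: {[t.name for t in tools]}")

        llm = ChatOllama(
            model=os.getenv("OLLAMA_MODEL", "llama3.3"),
            base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
            temperature=float(os.getenv("OLLAMA_TEMPERATURE", "0.8")),
        )
        agent = create_react_agent(llm, tools)

        result = await agent.ainvoke({"messages": [("user", "What is (3 + 5) * 12?")]})
        print(result["messages"][-1].content)


    if __name__ == "__main__":
        asyncio.run(main())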

Features

  • Loads MCP server configurations from a YAML file (mcp_servers_config.yaml).
  • Connects to multiple MCP servers concurrently (examples include local Python scripts for Math and Weather, and the official SQLite reference server).
  • Integrates MCP tools seamlessly into a LangChain agent using langchain-mcp-adapters.
  • Includes a simple demonstration script (main.py) to showcase loading configurations and fetching tools.
  • Provides an interactive chat interface (chat_interface.py) powered by a local Ollama model (llama3.3 by default) that can utilize the configured MCP tools.
  • The chat interface includes helpful custom commands (a rough dispatch sketch follows this list):
    • /list_servers: List the names of servers defined in the configuration file.
    • /list_tools: List the names of all available tools and the server providing them.
    • /list_tools_details: List detailed information (name, description) for all tools, grouped by server.
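
The command names above come from the project; their implementation is not shown on this page. Purely as a hypothetical sketch, a dispatch for such commands inside the chat loop could look like this (the connections dict and tools list are assumed to come from the configuration-loading code shown earlier):

    # Hypothetical command dispatch for the chat loop - not the repository's actual code.
    def handle_command(line: str, connections: dict, tools: list) -> bool:
        """Return True if the input was handled as a /command, else False."""
        command = line.strip()
        if command == "/list_servers":
            for name in connections:
                print(name)
        elif command == "/list_tools":
            # The real interface also reports which server provides each tool;
            # that bookkeeping is omitted here.
            for tool in tools:
                print(tool.name)
        elif command == "/list_tools_details":
            for tool in tools:
                print(f"{tool.name}: {tool.description}")
        else:
            return False
        return True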

Setup

  1. Clone this repository:
    git clone <your-repository-url>
    cd mcp_cli_langchain
    
  2. Python Environment: Requires Python 3.x. Create and activate a virtual environment:
    python -m venv venv
    source venv/bin/activate  # On Windows use `venv\Scripts\activate`
    
  3. Install Dependencies: Install required Python packages:
    pip install -r requirements.txt
    
    (Note: requirements.txt includes python-dotenv, which is needed to load the .env file.)
  4. Install langchain-mcp-adapters (Editable): This project uses a potentially modified local version of the adapter library. Install it in editable mode:
    # Ensure you are in the project root directory (mcp_cli_langchain)
    cd langchain-mcp-adapters
    pip install -e .
    cd ..
    
  5. Clone MCP Reference Servers: The configuration uses the official SQLite reference server. Clone the repository into the project root:
    git clone https://github.com/modelcontextprotocol/servers.git
    
  6. Install uv: The SQLite server configuration uses uv to run. Install uv by following the instructions on https://github.com/astral-sh/uv.
  7. Setup Ollama:
    • Ensure Ollama is installed and the service is running.
    • Pull the required model (defaults to llama3.3 in chat_interface.py):
      ollama pull llama3.3
      

Configuration

  • MCP Servers: MCP server connections are defined in mcp_servers_config.yaml.
    • You can add, remove, or modify server entries in this file.
    • Pay attention to paths (e.g., for local script servers or the cloned servers directory), ensuring they are correct relative to the project root (mcp_cli_langchain).
    • The default configuration includes math, weather, and sqlite servers (an illustrative YAML sketch follows this section).
  • Ollama: Ollama settings for the chat interface are configured via the .env file in the project root.
    • OLLAMA_MODEL: Specifies the Ollama model to use (e.g., llama3.3).
    • OLLAMA_BASE_URL: The base URL for your running Ollama instance (e.g., http://localhost:11434).
    • OLLAMA_TEMPERATURE: Controls the creativity/randomness of the model's output (e.g., 0.8).
    • Create a .env file if it doesn't exist, based on the example:
    # .env example
    OLLAMA_MODEL=llama3.3
    OLLAMA_BASE_URL=http://localhost:11434
    OLLAMA_TEMPERATURE=0.8
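
As an illustration only, a configuration covering the three default servers might look like the sketch below; the actual keys and paths in this repository's mcp_servers_config.yaml may differ, and the local script paths shown here are hypothetical. The per-server fields (command, args, transport) follow the stdio connection format used by langchain-mcp-adapters, and the sqlite entry mirrors how the official reference server is typically launched with uv.

    # Illustrative sketch of mcp_servers_config.yaml - check the file in this repo for the real schema.
    math:
      command: python
      args: ["math_server.py"]          # hypothetical path to the local Math script
      transport: stdio
    weather:
      command: python
      args: ["weather_server.py"]       # hypothetical path to the local Weather script
      transport: stdio
    sqlite:
      command: uv
      args: ["--directory", "servers/src/sqlite", "run", "mcp-server-sqlite", "--db-path", "./test.db"]
      transport: stdio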
    

Running the Project

Ensure your virtual environment is activated before running either script.
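
The exact run commands are not listed in this section; assuming the two scripts are invoked directly, something like the following should work from the project root:

    python main.py            # demo: load the configuration and list the available MCP tools
    python chat_interface.py  # interactive chat that can call the configured MCP tools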

Quick Start

  1. Clone the repository:
    git clone https://github.com/andrewsydney/mcp_cli_langgraph

  2. Install dependencies:
    cd mcp_cli_langgraph
    pip install -r requirements.txt

  3. Follow the documentation: check the repository's README.md (reproduced in the Documentation section above) for detailed installation and usage instructions.

Repository Details

  • Owner: andrewsydney
  • Repo: mcp_cli_langgraph
  • Language: Python
  • License: Other
  • Last fetched: 8/10/2025

Recommended MCP Servers

  • 💬 Discord MCP: Enable AI assistants to seamlessly interact with Discord servers, channels, and messages. (integrations, discord, chat)
  • 🔗 Knit MCP: Connect AI agents to 200+ SaaS applications and automate workflows. (integrations, automation, saas)
  • 🕷️ Apify MCP Server: Deploy and interact with Apify actors for web scraping and data extraction. (apify, crawler, data)
  • 🌐 BrowserStack MCP: BrowserStack MCP Server for automated testing across multiple browsers. (testing, qa, browsers)
  • Zapier MCP: A Zapier server that provides automation capabilities for various apps. (zapier, automation)