
LangOctopus MCP Project
A retro-style chat interface for interacting with the LangGraph MCP agent.
Project Structure
/
├── backend/              # Flask backend server for the UI
│   ├── app.py
│   └── ...               # (removed requirements.txt)
├── config/               # Configuration files (if any)
├── doc/                  # Project documentation files
├── frontend/             # React frontend application
│   ├── public/
│   ├── src/
│   ├── package.json
│   └── package-lock.json # npm lock file
├── lambda/               # AWS SAM application for deploying MCP servers as Lambda
│   ├── math/
│   ├── weather/
│   ├── authorizer/
│   ├── client.py         # Lambda-specific agent entrypoint
│   ├── client_adapter.py # Adapter for Lambda client
│   └── README.md
├── scripts/              # Utility scripts
│   └── run_ui.sh         # Runs frontend and backend
├── src/                  # Core agent source code
│   ├── agent/            # LangGraph agent implementation
│   │   └── client.py
│   ├── mcp_servers/      # MCP server implementations (local)
│   │   ├── math_server.py
│   │   └── weather_server.py
│   └── utils/
├── tests/                # Test files
│   └── run_tests.py      # Test runner
├── trash/                # Directory for deleted/obsolete files
├── .cursor/              # Cursor configuration and rules
├── .git/                 # Git directory
├── .gitignore
├── .env.example          # Example environment variables
├── pyproject.toml        # Python dependencies (Poetry)
├── poetry.lock           # Poetry lock file
├── README.md             # This file
├── run.py                # Main script to run the agent (local servers)
└── test_queries.py       # Example queries for the agent
Setup

1. Clone the repository:

   git clone <repository-url>
   cd langoctopus-mcp

2. Install Python Dependencies:
   - Ensure you have Python 3.10+ and Poetry installed.
   - Install dependencies:

     poetry install

3. Install Frontend Dependencies:
   - Ensure you have Node.js and npm installed.
   - Navigate to the frontend directory and install dependencies:

     cd frontend
     npm install
     cd ..

4. Environment Variables:
   - Copy .env.example to .env.
   - Fill in your GOOGLE_API_KEY in the .env file.
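Once the variable is set, the agent can read the key at startup. Here is a minimal, purely illustrative sketch using only the standard library; the actual agent code may read the key differently (for example, by loading the .env file with python-dotenv):

```python
import os


def get_google_api_key() -> str:
    """Fetch GOOGLE_API_KEY from the environment, failing fast if unset.

    Illustrative helper only -- the real agent may load the key via
    python-dotenv or another mechanism.
    """
    key = os.environ.get("GOOGLE_API_KEY")
    if not key:
        raise RuntimeError(
            "GOOGLE_API_KEY is not set; copy .env.example to .env and fill it in"
        )
    return key
```

Failing fast like this gives a clearer error than letting the Google client library fail later with an opaque authentication message.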
Running the Application

Option 1: Local MCP Servers (Recommended for Development)

1. Run the Agent and Local Servers: This script handles starting the local Math and Weather MCP servers and then runs the main agent (src/agent/client.py):

   poetry run python run.py

   The agent will be ready for input in the terminal.

2. Run the UI: Open another terminal and run the UI script. This starts the Flask backend and serves the React frontend:

   ./scripts/run_ui.sh

   Access the UI at http://localhost:5000 (or the specified port).
Option 2: Agent with Deployed Lambda MCP Servers
Refer to the lambda/README.md for instructions on deploying the SAM application and running the agent (lambda/client.py) against the deployed functions.
Running Tests
cd tests
poetry run python run_tests.py
cd ..
Using the Agent
Once the agent is running (either via run.py or lambda/client.py), you can interact with it in the terminal where it was started. Type your queries and press Enter. Type exit to quit.
Example queries are available in test_queries.py.
Project Overview
This project consists of:
- Backend: A Flask server with Socket.IO integration that connects to the LangGraph MCP agent.
- Frontend: A React application with a retro-style UI featuring robot avatars and a chat interface.
- Agent: The existing LangGraph MCP agent that can answer math and weather questions.
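Conceptually, each chat message flows UI → Socket.IO → Flask backend → agent and back. A hedged sketch of that hand-off, with class and field names that are illustrative rather than taken from the actual backend/app.py:

```python
class ChatSession:
    """Minimal stand-in for the backend's per-connection chat state.

    Hypothetical sketch: the real Flask/Socket.IO backend may use
    different event names and payload fields.
    """

    def __init__(self, agent):
        self.agent = agent   # callable that invokes the LangGraph agent
        self.history = []    # list of {"role": ..., "text": ...} dicts

    def handle_message(self, message: str) -> dict:
        """Record the user message, call the agent, and return its reply."""
        self.history.append({"role": "user", "text": message})
        reply = {"role": "agent", "text": self.agent(message)}
        self.history.append(reply)
        return reply
```

Keeping history on the session object is what lets the UI replay the full conversation between the two robots.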
Directory Structure
- /src - Core agent functionality
  - /agent - LangGraph agent implementation
  - /mcp_servers - MCP server implementations (math, weather)
- /frontend - React-based retro-style UI
- /backend - Flask server with Socket.IO integration
- /run.sh - Script to run the agent servers
- /run_ui.sh - Script to run the frontend and backend servers
Features
- Retro-style UI with custom robot avatars
- Real-time communication using Socket.IO
- Automated conversation flow (questions and answers)
- Integration with the existing agent for math and weather questions
- Responsive design with animations
Development
Backend
The backend is a Flask application with Socket.IO integration. It connects to the existing LangGraph MCP agent to process questions.
Key files:
- backend/app.py - Main Flask application with Socket.IO setup
Frontend
The frontend is a React application with a retro-style UI.
Key components:
- App.jsx - Main application component
- LeftRobot.jsx - Left robot component (questioner)
- RightRobot.jsx - Right robot component (answerer)
- ChatHistory.jsx - Chat history display component
Adding Custom Questions
You can modify the test_queries.py file to add your own questions to the sample set.
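For example, test_queries.py might be organized as a plain list that the agent loop iterates over. This is a hypothetical shape, not the actual contents of the file:

```python
# Hypothetical shape of test_queries.py; the real file may differ.
QUERIES = [
    "what is (3 + 5) * 12?",        # handled by the math MCP server
    "what is the weather in NYC?",  # handled by the weather MCP server
]


def add_query(query: str) -> list[str]:
    """Append a custom question and return the current sample set."""
    QUERIES.append(query)
    return QUERIES
```

A flat list like this keeps custom questions trivial to add: append a string and rerun the agent.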
Troubleshooting
- Port conflicts: The script will detect if ports 3001 or 5001 are already in use and will automatically kill the processes using them.
- Missing dependencies: The script will attempt to install missing dependencies automatically.
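If you'd rather check the ports yourself before launching (3001 and 5001, per the script behavior above), a quick standard-library probe might look like this; it is an illustrative check only, since scripts/run_ui.sh already handles conflicts automatically:

```python
import socket


def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port.

    Illustrative helper; a successful TCP connect means the port is taken.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0
```

For example, `port_in_use(5001)` returning True means another process must be stopped (or the backend moved to a different port) before running the UI script.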
License
[Your license information here]
Quick Start

1. Clone the repository:

   git clone https://github.com/bprzybys-nc/langoctopus-mcp

2. Install dependencies:

   cd langoctopus-mcp
   npm install

3. Follow the Setup and Running the Application sections above for detailed installation and usage instructions.