
efflux backend
An LLM-based Agent chat client backend, supporting streaming responses and chat history management.
Efflux - Backend
LLM Agent Chat Client Backend
English | 简体中文
Efflux is an LLM-based Agent chat client featuring streaming responses and chat history management. As an MCP Host, it leverages the Model Context Protocol to connect with various MCP Servers, enabling standardized tool invocation and data access for large language models.
Key Features
- Rapid Agent construction
- Dynamic MCP tool loading and invocation
- Support for multiple large language models
- Real-time streaming chat responses
- Chat history management
Online Demo
- 🏠 Efflux Homepage
- 🚀 Interactive Demo
Requirements
- Python 3.12+
- PostgreSQL
- uv (Python package & environment manager), installable via
pip install uv
Quick Start
- Clone the project
git clone git@github.com:isoftstone-data-intelligence-ai/efflux-backend.git
cd efflux-backend
- Install uv
pip install uv
- Install dependencies (clean reinstall)
uv sync --reinstall
- Activate virtual environment
# Activate virtual environment
source .venv/bin/activate # macOS/Linux
# Deactivate when needed
deactivate
- Configure environment variables
# Copy environment variable template
cp .env.sample .env
# Edit .env file, configure:
# 1. Database connection info (DATABASE_NAME, DATABASE_USERNAME, DATABASE_PASSWORD)
# 2. At least one LLM configuration (e.g., Azure OpenAI, Qwen, Doubao, or Moonshot)
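As a sketch, a filled-in `.env` might look like the following. The values are placeholders; only the three database variable names come from the notes above, and the exact LLM variable names depend on which provider you configure (check `.env.sample` for them):

```shell
# Database connection (variable names as listed above)
DATABASE_NAME=efflux
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password

# Plus at least one LLM provider's settings -- see .env.sample for the
# exact variable names for Azure OpenAI, Qwen, Doubao, or Moonshot
```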
- Select the LLM
# Edit core/common/container.py file
# Find the llm registration section, replace with any of the following models (Qwen by default):
# - QwenLlm: Qwen
# - AzureLlm: Azure OpenAI
# - DoubaoLlm: Doubao
# - MoonshotLlm: Moonshot
# Example: Using Azure OpenAI
from core.llm.azure_open_ai import AzureLlm
# ...
llm = providers.Singleton(AzureLlm)
- Start PostgreSQL database
# Method 1: If PostgreSQL is installed locally
# Simply start your local PostgreSQL service
# Method 2: Using Docker (example)
docker run -d --name local-postgres \
-e POSTGRES_DB=your_database_name \
-e POSTGRES_USER=your_username \
-e POSTGRES_PASSWORD=your_password \
-p 5432:5432 \
postgres
# Note: Ensure database connection info matches the configuration in your .env file
- Initialize database
# Create a new version and generate a migration file in alembic/versions
alembic revision --autogenerate -m "initial migration"
# Preview SQL to be executed:
alembic upgrade head --sql
# If preview looks good, execute migration
alembic upgrade head
- Initialize LLM template data
# Run initialization script
python scripts/init_llm_templates.py
- Start the service
python -m uvicorn main:app --host 0.0.0.0 --port 8000
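Once the service is up, FastAPI's auto-generated API docs should be available (by default at `/docs`) for exploring the endpoints. Since the backend streams chat responses, a client typically consumes them line by line; below is a minimal, hedged sketch. The endpoint path and payload shape are assumptions, not taken from the repo; only the SSE-style line-parsing helper is concrete:

```python
import json


def parse_sse_line(line: str):
    """Parse one line of a server-sent-events-style stream.

    Returns the decoded JSON payload for 'data: ...' lines,
    or None for comments, keep-alives, and blank lines.
    """
    line = line.strip()
    if line.startswith("data: "):
        return json.loads(line[len("data: "):])
    return None


# Hypothetical usage against the running server (endpoint name assumed,
# using aiohttp, which the project already depends on):
#
#   import aiohttp, asyncio
#
#   async def main():
#       async with aiohttp.ClientSession() as http:
#           async with http.post("http://localhost:8000/chat",
#                                json={"message": "hello"}) as resp:
#               async for raw in resp.content:
#                   event = parse_sse_line(raw.decode())
#                   if event:
#                       print(event)
#
#   asyncio.run(main())
```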
Acknowledgments
This project utilizes the following excellent open-source projects and technologies:
- @modelcontextprotocol/mcp - Standardized open protocol for LLM data interaction
- @langchain-ai/langchain - LLM application development framework
- @sqlalchemy/sqlalchemy - Python SQL toolkit and ORM framework
- @pydantic/pydantic - Data validation and settings management
- @tiangolo/fastapi - Modern, fast web framework
- @aio-libs/aiohttp - Async HTTP client/server framework
- @sqlalchemy/alembic - Database migration tool for SQLAlchemy
- @astral-sh/uv - Ultra-fast Python package manager
- @python-colorlog/colorlog - Colored log output tool
- @jlowin/fastmcp - Python framework for building MCP servers
- @langchain-ai/langgraph - Framework for building stateful multi-agent LLM applications
Thanks to the developers and maintainers of these projects for their contributions to the open-source community.