
qdrant-ltm

An AI chat application with long-term memory built on the Qdrant vector database, supporting context-aware conversations.

Repository Info

  • Stars: 0
  • Forks: 0
  • Watchers: 0
  • Issues: 1
  • Language: TypeScript
  • License: -


Documentation

Qdrant LTM (Long Term Memory) AI Chat

A sophisticated AI chat application that leverages Qdrant vector database for long-term memory storage and retrieval, enabling context-aware conversations with persistent memory.

Features

  • 🤖 LLM chat interface with memory integration
  • 🧠 Long-term memory storage using Qdrant vector database
  • 📊 WIP: Memory visualization and management
  • 🔄 WIP: Context-aware conversations with memory retrieval

Tech Stack

  • Frontend: Next.js 14, TypeScript, Tailwind CSS
  • Backend: NestJS, TypeScript, OpenAI API
  • Database: Qdrant vector database
  • Embedding: Local open-source embedding model running with Transformers.js
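Conceptually, the local embedder reduces a model's per-token vectors to a single sentence vector by mean pooling, then L2-normalizes it so that cosine similarity reduces to a dot product (Transformers.js performs these steps when asked for pooled, normalized output). A minimal self-contained sketch of that post-processing, with illustrative names rather than the project's actual API:

```typescript
// Mean-pool per-token vectors into one sentence vector.
function meanPool(tokenVectors: number[][]): number[] {
  const dim = tokenVectors[0].length;
  const pooled = new Array(dim).fill(0);
  for (const vec of tokenVectors) {
    for (let i = 0; i < dim; i++) pooled[i] += vec[i];
  }
  return pooled.map((x) => x / tokenVectors.length);
}

// L2-normalize so cosine similarity becomes a plain dot product.
function l2Normalize(vec: number[]): number[] {
  const norm = Math.sqrt(vec.reduce((s, x) => s + x * x, 0));
  return vec.map((x) => x / norm);
}

const sentenceVector = l2Normalize(meanPool([[1, 0], [0, 1]]));
console.log(sentenceVector); // a unit-length vector
```

In the real service these steps run inside the Transformers.js feature-extraction pipeline; the sketch only shows why two embeddings can be compared with a dot product afterwards.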

Project Structure

qdrant-ltm/
├── frontend/             # Next.js frontend application
├── backend/              # NestJS backend application
│   └── src/
│       └── llm/          # LLM module
│          ├── memory/    # Qdrant memory service (Qdrant Client, OpenAI Client)
│          └── embedding/ # Embedding service (Local Embedding / OpenAI Embedding)
├── embedder/             # Local embedding server
├── compose.yml           # Docker compose file
└── .env                  # Environment variables

Setup

You can run the application with all dependencies using Docker Compose or manually by following the instructions below.

Prerequisites:

  • Docker

You may use the supplied compose.yml file to run the application with all dependencies.

  1. First copy .env.example to .env and fill in the required variables.

  2. Run:

docker compose up --build

Embedder Note: It might take a minute for the embedder to download the 500 MB embedding model.

See logs with:

docker compose logs -f

Without Docker Compose (or local development)

Prerequisites:

  • pnpm (recommended)

You can run the services manually:

  1. Qdrant Vector DB

docker run -p 6333:6333 -v qdrant_storage:/qdrant/storage qdrant/qdrant

  2. Embedder

cd embedder
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python app.py

Embedder Note: It might take a minute for the embedder to download the 500 MB embedding model.

  3. Backend

cd backend
pnpm i
pnpm run start:dev

  4. Frontend

cd frontend
pnpm i
pnpm run dev

Usage

  1. Open the application in your browser at http://localhost:3000.
  2. Start a new session by clicking the "Start New Session" button.
  3. Begin chatting with the AI assistant.
  4. The system will automatically retrieve relevant memories from previous conversations.

Memory Management

The application uses Qdrant vector database to store and retrieve memories. Memories are automatically created and retrieved based on:

  • Semantic similarity with current conversation
  • Timestamp relevance
  • Contextual importance
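A minimal sketch of how a ranking could combine the semantic and timestamp signals above: weight cosine similarity by an exponential recency decay. This is illustrative only; the project's actual scoring lives in the Qdrant memory service and may differ, and the half-life parameter is an assumption:

```typescript
interface Memory {
  vector: number[];   // embedding of the stored memory
  createdAt: number;  // Unix timestamp (ms)
}

// Cosine similarity between two embeddings.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Relevance = semantic similarity × exponential recency decay.
// halfLifeMs is a tunable assumption, not a value from the project.
function relevance(query: number[], mem: Memory, now: number, halfLifeMs: number): number {
  const decay = Math.pow(0.5, (now - mem.createdAt) / halfLifeMs);
  return cosine(query, mem.vector) * decay;
}
```

With identical vectors and an age of exactly one half-life, the score is 0.5: still similar, but discounted for age.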

TODOs:

  • Improve memory TTL (better prompt engineering)
  • Make memories delete on TTL expiration -> can become redundant if I introduce a "memory strength" metric
  • Implement human-like memory retention: give each memory a "strength" that gradually decreases over time and increases when the memory is retrieved or a similar memory is created
  • Allow the assistant to query the memory database (via MCP?) -> Example use case: User: "What else do you know about me?", and the assistant calls a tool to query the memory database
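The "memory strength" idea in the TODOs above could be modeled as a value that decays exponentially over time and is reinforced on retrieval, with deletion when it falls below a threshold. A hypothetical sketch, not code from the project; the half-life, boost, cap, and threshold are all assumed tuning values:

```typescript
class MemoryStrength {
  strength: number;
  lastAccess: number;
  readonly halfLifeMs: number;

  constructor(strength = 1.0, lastAccess = Date.now(), halfLifeMs = 7 * 24 * 3600 * 1000) {
    this.strength = strength;
    this.lastAccess = lastAccess;
    this.halfLifeMs = halfLifeMs; // assumed one-week half-life by default
  }

  // Exponential decay since the last access.
  current(now = Date.now()): number {
    return this.strength * Math.pow(0.5, (now - this.lastAccess) / this.halfLifeMs);
  }

  // Retrieval reinforces the memory (capped so it cannot grow without bound).
  reinforce(now = Date.now()): void {
    this.strength = Math.min(this.current(now) + 0.5, 2.0);
    this.lastAccess = now;
  }

  // A memory whose strength falls below a threshold could be deleted,
  // replacing a fixed TTL.
  isExpired(now = Date.now(), threshold = 0.05): boolean {
    return this.current(now) < threshold;
  }
}
```

This makes TTL deletion a consequence of decay rather than a separate mechanism, which is exactly the redundancy the TODO anticipates.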

Quick Start

  1. Clone the repository:

git clone https://github.com/codeswhite/qdrant-ltm
cd qdrant-ltm

  2. Follow the Setup instructions above for installation and usage.

