localmind

LocalMind is a local LLM chat app fully compatible with the Model Context Protocol. It uses Azure OpenAI as its LLM backend and can connect to any MCP server out there.

Repository Info

Stars: 0
Forks: 0
Watchers: 0
Issues: 0
Language: TypeScript
License: -

About This Server

Model Context Protocol (MCP) - LocalMind integrates with MCP servers to give the chat model additional context and capabilities beyond the base LLM.

Documentation

LocalMind

LocalMind is a local LLM chat app fully compatible with the Model Context Protocol. It uses Azure OpenAI as its LLM backend and can connect to any MCP server out there.

Local Development

Create a .env file in the backend folder:

APP_CONFIG_FILE_PATH=config.yaml
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
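
As a quick sanity check that the endpoint, key, and deployment are valid before starting the backend, you can call the Azure OpenAI chat completions REST API directly. This is not part of LocalMind itself and assumes the variables above are exported in your shell:

# Minimal request against the configured Azure OpenAI deployment
curl "$AZURE_OPENAI_ENDPOINT/openai/deployments/$AZURE_OPENAI_DEPLOYMENT/chat/completions?api-version=$AZURE_OPENAI_API_VERSION" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{"messages": [{"role": "user", "content": "ping"}]}'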

Create a config.yaml file in your backend folder:

server:
- name: [SERVER_NAME]
  command: [SERVER_COMMAND]
  args:
  - [SERVER_ARGS]
[...]
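
As a concrete illustration (not shipped with LocalMind), an entry for the official filesystem MCP server started via npx could look like the following, assuming the placeholders above map directly to the server name, executable, and argument list:

server:
- name: filesystem
  command: npx
  args:
  - -y
  - "@modelcontextprotocol/server-filesystem"
  - /path/to/allowed/dir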

To work on the frontend in the browser with the Python backend up and running:

./dev.sh frontend-dev

To run the Tauri app in development mode with the Python backend:

./dev.sh app-dev

RAG MCP Server

If you would like to use or work on the RAG MCP Server, first create a .env file in the rag folder:

AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding

Create a venv and install dependencies:

cd rag
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Then add the following config entry to your config.yaml in your backend folder:

server:
- name: rag
  command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
  args:
  - [ABSOLUTE_PATH]/rag/main.py
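
Note that this is an additional item in the same server list, not a second server: key. Combined with the placeholder entry from above, the file would look roughly like this:

server:
- name: [SERVER_NAME]
  command: [SERVER_COMMAND]
  args:
  - [SERVER_ARGS]
- name: rag
  command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
  args:
  - [ABSOLUTE_PATH]/rag/main.py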

Important

LocalMind currently works only with the Azure OpenAI Service.

Quick Start

1. Clone the repository

git clone https://github.com/mcpflow/localmind

2. Install dependencies

cd localmind
npm install

3. Follow the documentation

Check the repository's README.md file for specific installation and usage instructions.

Repository Details

Owner: mcpflow
Repo: localmind
Language: TypeScript
License: -
Last fetched: 8/10/2025

Recommended MCP Servers

💬 Discord MCP - Enable AI assistants to seamlessly interact with Discord servers, channels, and messages. (integrations, discord, chat)

🔗 Knit MCP - Connect AI agents to 200+ SaaS applications and automate workflows. (integrations, automation, saas)

🕷️ Apify MCP Server - Deploy and interact with Apify actors for web scraping and data extraction. (apify, crawler, data)

🌐 BrowserStack MCP - BrowserStack MCP Server for automated testing across multiple browsers. (testing, qa, browsers)

Zapier MCP - A Zapier server that provides automation capabilities for various apps. (zapier, automation)