
ollama_starter

A modular, local-first AI assistant system that runs fully offline, offering multi-persona interaction and a symbolic protocol.

Repository Info

Stars: 0
Forks: 0
Watchers: 0
Issues: 0
Language: Python
License: -

About This Server

A modular, local-first AI assistant system that runs fully offline, offering multi-persona interaction and a symbolic protocol.

Model Context Protocol (MCP): this server can be integrated with AI applications to supply additional context and capabilities.

Documentation

🌌 QuietEdge: Symbolic AI Assistant (Offline, Local-First)

QuietEdge is a modular, local-first AI assistant system powered by Ollama. It runs entirely offline with no cloud dependencies, offering symbolic, multi-persona interaction through a Gradio web UI and a streaming terminal interface.

  • 🔒 Privacy-first: All interactions stay on your machine
  • 🔧 Modular tools: Note-taking, summarization, search (extensible)
  • 🧠 Multiple personas: Define styles, memory scope, tools per agent
  • 🌀 Symbolic protocol: Context-aware routing through the MCP layer
  • Simple and elegant: Quiet by design, with an intuitive interface
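
As a sketch of how such personas might be defined (all names and fields here are hypothetical; the actual personas.py may differ):

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """One AI persona: a style, a memory scope, and the tools it may use."""
    name: str
    style: str
    memory_scope: str = "session"
    tools: list[str] = field(default_factory=list)

# Hypothetical registry; real persona definitions live in main/personas.py.
REGISTRY = {
    "scribe": Persona("scribe", style="concise notes", tools=["note", "summarize"]),
    "scout": Persona("scout", style="exploratory", tools=["search"]),
}

def switch_persona(name: str) -> Persona:
    # Unknown names fall back to a default persona rather than failing.
    return REGISTRY.get(name, REGISTRY["scribe"])
```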

📁 Project Structure

ollama_starter/
├── main/
│   ├── mcp.py                  # Context coordination layer
│   ├── memory.py               # Long-term and persona memory
│   ├── ollama_assistant.py     # Ollama local model interface
│   ├── personas.py             # Define and switch AI personas
│   └── tools/                  # Modular symbolic tools
├── scripts/
│   ├── stream_terminal_chat.py # Terminal-based streaming chat
│   ├── launch_web_ui.py        # Gradio UI launcher
│   └── clean_structure.sh      # Project cleanup automation
├── web/
│   ├── interface.py            # Gradio logic
│   └── ui.py                   # Stream handling and formatting
├── README.md
└── requirements.txt
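
To illustrate the memory layer above, a minimal JSON-backed store might look like this (illustrative only; the real memory.py may use a different interface):

```python
import json
from pathlib import Path

class JsonMemory:
    """Minimal JSON-backed memory store: one file, append and recall."""

    def __init__(self, path="memory.json"):
        self.path = Path(path)
        self.entries = json.loads(self.path.read_text()) if self.path.exists() else []

    def remember(self, text):
        # Append and persist immediately so memory survives restarts.
        self.entries.append(text)
        self.path.write_text(json.dumps(self.entries))

    def recall(self, limit=5):
        # Most recent entries last, capped at `limit`.
        return self.entries[-limit:]
```

A SQLite variant could expose the same remember/recall interface, which keeps the backend swappable.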

⚙️ Setup Instructions

1. Install Dependencies

# Create and activate your environment
python3 -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Install Ollama if not installed
curl -fsSL https://ollama.com/install.sh | sh

2. Start Ollama (locally)

ollama serve
ollama run mistral  # Or your preferred model (e.g., llama3)
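
Ollama streams replies as newline-delimited JSON, one chunk per token batch, ending with a `done` flag. A minimal consumer sketch (fed simulated chunks here instead of a live connection):

```python
import json

def stream_tokens(ndjson_lines):
    """Yield text fragments from Ollama-style NDJSON chunks until 'done'."""
    for raw in ndjson_lines:
        chunk = json.loads(raw)
        yield chunk.get("response", "")
        if chunk.get("done"):
            break

# Simulated chunks in the shape the local server streams back:
chunks = ['{"response": "Hel"}', '{"response": "lo"}', '{"response": "", "done": true}']
print("".join(stream_tokens(chunks)))  # prints "Hello"
```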

🚀 Usage

🌐 Launch Web UI (Gradio)

python scripts/launch_web_ui.py

Features:

  • Persona switching
  • Markdown + streaming output
  • Interactive tools (note-taking, summarization, search)
  • Tag-based prompt parsing (@note, @summarize, etc.)
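
Tag parsing of this kind might be implemented roughly as follows (a hypothetical sketch, not the project's actual parser):

```python
import re

# Matches a leading @tool tag followed by the rest of the prompt.
TAG_PATTERN = re.compile(r"@(\w+)\s*(.*)", re.DOTALL)

def parse_prompt(prompt):
    """Split a prompt into (tool, payload); tool is None for plain chat."""
    match = TAG_PATTERN.match(prompt.strip())
    if match:
        return match.group(1), match.group(2).strip()
    return None, prompt.strip()
```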

💻 Stream Terminal Chat

python scripts/stream_terminal_chat.py

Use markdown, tags, or CLI-friendly prompts like:

::note summarize today’s meeting
::search how to build local LLM assistant
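
A ::-prefixed command line could be split off from plain chat in a few lines (hypothetical sketch; the real terminal parser may differ):

```python
def parse_cli_command(line):
    """Map a '::tool args' line to (tool, args); anything else is plain chat."""
    if line.startswith("::"):
        tool, _, args = line[2:].partition(" ")
        return tool, args.strip()
    return "chat", line.strip()
```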

🧩 Features

  • ✅ Symbolic tool system with @tags and command triggers
  • ✅ Multi-persona support (memory, tone, tool access)
  • ✅ Streaming support (both UI and terminal)
  • ✅ Modular MCP for intelligent input routing
  • ✅ Extensible memory backend (JSON, SQLite optional)
  • ✅ Minimal dependencies, low RAM usage
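
The routing idea behind the MCP layer can be sketched as a small dispatch table (handler names are hypothetical; the real tools live under main/tools/):

```python
def route(tool, payload, handlers):
    """Dispatch a parsed (tool, payload) pair; unknown tools fall back to chat."""
    handler = handlers.get(tool) or handlers["chat"]
    return handler(payload)

# Hypothetical handlers standing in for the modular tools.
handlers = {
    "chat": lambda text: f"[chat] {text}",
    "note": lambda text: f"[note saved] {text}",
    "summarize": lambda text: f"[summary] {text}",
}

print(route("note", "buy milk", handlers))  # prints "[note saved] buy milk"
```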

🤝 Contributing

Pull requests welcome!

# Fork and clone
git clone https://github.com/yourname/ollama_starter.git
cd ollama_starter

# Make your changes on a new branch
git checkout -b feature/my-improvement

To submit:

  1. Follow a clean, modular Python structure
  2. Keep symbolic naming clear and consistent
  3. Test main/ logic before wiring it into the UI

🔐 Philosophy

QuietEdge is designed for thinkers, artists, and builders who want powerful AI tools without noise, cloud, or distraction. Each persona reflects a facet of intelligence. The system respects autonomy, creativity, and clarity.


📜 License

MIT — free to use, remix, build.


Made with 🛠️ in a quiet room, far from the cloud.

Quick Start

1. Clone the repository

   git clone https://github.com/irfankabir02/ollama_starter

2. Install dependencies

   cd ollama_starter
   pip install -r requirements.txt

3. Follow the documentation

   Check the repository's README.md for detailed installation and usage instructions.

Repository Details

Owner: irfankabir02
Repo: ollama_starter
Language: Python
License: -
Last fetched: 8/10/2025
