portallm
A public MCP server by aufwindmalte

PortaLLM is an open-source bridge between browser-based AI chat tools (like ChatGPT, Claude, or Gemini) and your local system — allowing intelligent conversations to trigger real actions on your machine, all without API costs or cloud dependency.

Repository Info

Stars: 0 · Forks: 0 · Watchers: 0 · Issues: 0
Language: JavaScript
License: GNU General Public License v3.0

About This Server


Model Context Protocol (MCP) – This server implements MCP, so it can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

PortaLLM – Local AI-to-System Bridge

PortaLLM is an open-source bridge between browser-based AI chat tools (like ChatGPT, Claude, or Gemini) and your local machine. It enables intelligent conversations to trigger real-world actions — like file operations, document creation, app automation, or scripting — without API tokens, cloud lock-in, or vendor control.

Talk to your AI. Let your system respond.


Why PortaLLM?

Language models are powerful reasoning tools — but in the browser, they are disconnected from your actual system. PortaLLM closes this gap.

With a simple browser extension and a local server, you can:

  • Use GPT to generate files or rename folders
  • Trigger automation (e.g., create a Word doc, sort PDFs)
  • Let LLMs guide command-line tasks
  • Access your local filesystem securely and privately
  • Keep everything offline, open, and under your control

How It Works

Browser LLM (ChatGPT, Claude)
    |
[ Extension ]
    |
Detected AI responses (text)
    |
Local MCP API (Flask server on localhost)
    |
Local actions (file ops, Office, scripts)

Project Structure

portallm/
├── extension/        # Chrome/Brave extension (Manifest V3, content + background scripts)
├── server/           # Flask-based local MCP endpoint
├── docs/             # Project documentation
├── examples/         # Sample JSON payloads and GPT prompts
├── tests/            # Optional test scripts
└── LICENSE           # GPL v3 license

Features

  • Local-first: All logic runs on your machine
  • Dynamic GPT-to-system connection: From conversation to automation
  • No tokens. No vendor billing. No surveillance
  • Modular design: Easily add more system capabilities
  • Works with any browser-based LLM

Setup Guide

1. Start the local server (Flask)

cd server
pip install flask
python app.py

This will start a local endpoint at: http://localhost:5000/mcp-hook
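The repository's actual app.py is not shown here, but a minimal sketch of a Flask endpoint matching the route above might look like the following. The payload shape (a single "text" field) is an assumption for illustration, not the project's documented schema:

```python
# Minimal sketch of the /mcp-hook endpoint. Hypothetical: the real
# server/app.py and its payload schema may differ.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/mcp-hook", methods=["POST"])
def mcp_hook():
    payload = request.get_json(silent=True) or {}
    text = payload.get("text", "")
    # A real handler would parse `text` and dispatch local actions
    # (file operations, Office automation, scripts, ...).
    return jsonify({"status": "ok", "received": len(text)})

# To serve it locally on the default Flask port (5000):
# app.run(host="127.0.0.1", port=5000)
```

Flask's development server listens on port 5000 by default, which is why the hook URL above uses localhost:5000.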


2. Load the browser extension

  1. Open chrome://extensions (Chrome or Brave)
  2. Enable "Developer mode"
  3. Click "Load unpacked"
  4. Select the extension/ folder

The extension will detect new LLM responses and send them to your local server.
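The extension itself is JavaScript, but the request it issues is a plain JSON POST, so it is easy to simulate when testing the server. As an illustration, here is a Python sketch that builds (without sending) an equivalent request; the single "text" field is again an assumed payload shape:

```python
import json
import urllib.request

def build_hook_request(text, url="http://localhost:5000/mcp-hook"):
    """Build (but do not send) the POST the extension would issue.
    The {"text": ...} body is a hypothetical payload shape."""
    data = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_hook_request("Please rename the following files alphabetically")
```

With the Flask server running, dispatching the request is a single urllib.request.urlopen(req) call.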


3. Example Use Case

ChatGPT outputs:

Please rename the following files alphabetically:
- delta.txt
- alpha.txt
- beta.txt

→ PortaLLM extension captures it → Sends it to localhost:5000/mcp-hook → Your MCP server executes the logic on your machine


Philosophy

This project is for developers, tinkerers, system nerds, and AI explorers who want real agency over their tools.

PortaLLM is released under the GNU GPL v3 to ensure:

  • Transparency
  • Open contributions
  • Resistance to proprietary capture

We believe automation should be yours — not rented.


Contributing

We welcome developers, testers, writers, and rebels.

To get involved:

  • Fork the repo
  • Check out CONTRIBUTING.md
  • Open issues or submit pull requests
  • Share new automation modules or use cases!

Inspiration & Credits

  • Inspired by the limitations of token-billed APIs
  • Motivated by the power of Claude, ChatGPT, and open LLMs
  • Influenced by the ethos of the CCC, FSF, and hacker communities

License

This project is licensed under the GNU General Public License v3.0. See the LICENSE file for details.

Quick Start

1. Clone the repository

git clone https://github.com/aufwindmalte/portallm

2. Install dependencies

cd portallm
npm install

3. Follow the documentation

Check the repository's README.md file for specific installation and usage instructions.

Repository Details

Owner: aufwindmalte
Repo: portallm
Language: JavaScript
License: GNU General Public License v3.0
Last fetched: 8/10/2025
