
llama-cloak

a simple ui for local llama sessions


View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
    • Built With
  2. Getting Started
    • Prerequisites
    • Installation
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgments

About The Project


llama-cloak is a simple GUI for interacting with your local Llama instance.

(back to top)

Built With

  • React Router
  • React
  • Tailwind CSS

(back to top)

Getting Started

This is a React Router v7 app. To run it locally, you need npm and a local Ollama instance serving the llama3.2 model.

Prerequisites

Install Ollama, then pull and run the llama3.2 model:

  • run llama locally
    ollama run llama3.2
    

Installation

  1. Clone the repo
    git clone https://github.com/the-wc/llama-cloak.git
    
  2. Install NPM packages
    npm install
    
  3. Start the dev server
    npm run dev
    
  4. Open llama-cloak at http://localhost:5173.

(back to top)

Usage

Type a message into the chat interface and you'll get a response.
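Under the hood, a UI like this talks to the local Ollama HTTP API. The sketch below is a minimal illustration (not llama-cloak's actual client code) of how a chat message could be sent to Ollama's /api/chat endpoint on its default port, assuming the llama3.2 model from the prerequisites:

```typescript
// Hypothetical sketch: sending a chat turn to a local Ollama instance.
// Assumes Ollama's default endpoint (http://localhost:11434/api/chat)
// and the llama3.2 model; llama-cloak's real client may differ.
type ChatMessage = { role: "user" | "assistant"; content: string };

function buildChatRequest(messages: ChatMessage[]) {
  return {
    model: "llama3.2",
    messages,
    stream: false, // set true to stream tokens into the UI as they arrive
  };
}

async function sendChat(messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(messages)),
  });
  const data = await res.json();
  // Non-streaming responses carry the reply in data.message.content.
  return data.message.content;
}
```

Setting stream to true instead returns newline-delimited JSON chunks, which is what makes the response appear token by token in the chat interface.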

(back to top)

Roadmap

I threw this together over a couple of hours, so feel free to clone or fork it and make your own changes. Some ideas:

  • Handle optimistic UI with a prompt
  • Store chats uniquely
  • GPU temp/mem monitor
  • Online status ping for checking local llama instance
  • Configuration for advanced use cases
  • Agentic/MCP support

See the open issues for a full list of proposed features (and known issues).
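For the "online status ping" idea, one possible approach (a sketch, not part of the current codebase) is to poll Ollama's base URL, which responds when the server is up, and treat any failure or timeout as offline:

```typescript
// Hypothetical sketch of the "online status ping" roadmap item:
// probe the local Ollama base URL and report reachability.
// Assumes Ollama's default address; adjust `base` for other setups.
async function isOllamaOnline(
  base: string = "http://localhost:11434"
): Promise<boolean> {
  try {
    // Abort the probe after 2 seconds so the UI never hangs on it.
    const res = await fetch(base, { signal: AbortSignal.timeout(2000) });
    return res.ok;
  } catch {
    // Connection refused, DNS failure, or timeout all count as offline.
    return false;
  }
}
```

A UI could call this on an interval (e.g. every few seconds) and render a green/red status dot next to the chat input.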

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)


License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Will - @dubkaycee

Project Link: https://github.com/the-wc/llama-cloak

(back to top)

Acknowledgments

  • Ollama
  • remix-utils by sergiodxa

(back to top)
