
Tome - Magical AI Spellbook
a magical desktop app that lets you chat with local or remote LLMs and schedule hourly or daily tasks, all superpowered by MCP
🔮 Download the Tome Desktop App: Windows | MacOS
Tome
Tome is a desktop app that lets anyone harness the magic of LLMs and MCP. Download Tome, connect any local or remote LLM and hook it up to thousands of MCP servers to create your own magical AI-powered spellbook.
What is MCP? MCP stands for Model Context Protocol and lets your LLM access tools - like search engines, your filesystem, or APIs like Scryfall or Atlassian.
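Curious what an MCP server actually looks like under the hood? You don't need any of this to use Tome, but as a rough sketch (assuming Node and uv are installed, and using the community MCP Inspector, a separate developer tool that is not part of Tome) you can browse a server's tools directly:

```
# launch the MCP Inspector against the Fetch server and browse the tools it exposes
npx @modelcontextprotocol/inspector uvx mcp-server-fetch
```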
🫥 Want it to be 100% local, 100% private? Use Ollama and Qwen3 with only local MCP servers to cast spells in your own pocket universe. ⚡ Want state of the art cloud models with the latest remote MCP servers? You can have that too. It's all up to you!
🏗️ This is a Technical Preview so bear in mind things will be rough around the edges. Join us on Discord to share tips, tricks, and issues you run into. Star this repo to stay on top of updates and feature releases!
🪄 Features
- 🧙 Streamlined Beginner-Friendly Experience
  - Simply download and install Tome and hook up the LLM of your choice
  - No fiddling with JSON, Docker, Python, or Node
  - Chat with MCP-powered models within minutes
- 🗓 NEW! Scheduled Tasks
  - Schedule prompts to run hourly or at a specific time every day
  - Support for any model or MCP server
- 🤖 AI Model Support
  - Remote: Google Gemini, OpenAI, any OpenAI API-compatible endpoint
  - Local: Ollama, LM Studio, Cortex, any OpenAI API-compatible endpoint
- 🔮 Enhanced MCP support
  - UI to install, remove, and turn MCP servers on or off
  - npm, uvx, node, and Python MCP servers all supported out of the box (example commands after this list)
- 🏪 Integration with the Smithery.ai registry
  - Thousands of MCP servers available via one-click installation
- ✏️ Customization of context windows and temperature
- 🧰 Native support for tool calls and reasoning models
  - UI enhancements that clearly delineate tool calls and thinking messages
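Under the hood these servers are launched by ordinary commands, the same ones you paste into Tome's server field. A couple of illustrative examples - the filesystem path is only a placeholder:

```
# a Python-based server, launched via uv
uvx mcp-server-fetch

# a Node-based server, launched via npx (the path is an example)
npx -y @modelcontextprotocol/server-filesystem ~/Documents
```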
Demo
https://github.com/user-attachments/assets/0775d100-3eba-4219-9e2f-360a01f28cce
Getting Started
Requirements
- MacOS or Windows (Linux coming soon!)
- An LLM provider of your choice: Ollama or a Gemini API key are easy/free ways to get started (an Ollama example is sketched after this list)
- Download the latest release of Tome
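If you go the Ollama route, a minimal local setup looks something like the following; the model is just a suggestion, and any model that supports tool calls will do:

```
# install Ollama from https://ollama.com, then pull a tool-calling model
ollama pull qwen3

# optional: sanity-check the model from the terminal
ollama run qwen3 "Say hello"
```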
Quickstart
- Install Tome
- Connect your preferred LLM provider - OpenAI, Ollama, and Gemini are preset, but you can also add providers like LM Studio by using http://localhost:1234/v1 as the URL (see the sketch after this list)
- Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to get started with; just paste `uvx mcp-server-fetch` into the server field).
- Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
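If you're pointing Tome at a custom OpenAI API-compatible endpoint (like the LM Studio URL above), you can check that it's reachable before adding it. A rough sketch - the model name depends on what you have loaded locally:

```
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen3", "messages": [{"role": "user", "content": "Say hello"}]}'
```

Any server that answers this style of request should work as a custom provider.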
Vision
We want to make local LLMs and MCP accessible to everyone. We're building a tool that allows you to be creative with LLMs, regardless of whether you're an engineer, tinkerer, hobbyist, or anyone in between.
Core Principles
- Tome is local first: You are in control of where your data goes.
- Tome is for everyone: You shouldn't have to manage programming languages, package managers, or json config files.
What's Next
We've gotten a lot of amazing feedback in the few weeks since releasing Tome, but we've got big plans for the future. We want to break LLMs out of their chatbox, and we've got a lot of features coming to help y'all do that.
- Scheduled tasks: LLMs should be doing helpful things even when you're not in front of the computer.
- Native integrations: MCP servers are a great way to access tools and information, but we want to add more powerful integrations to interact with LLMs in unique ways.
- App builder: we believe long term that the best experiences will not be in a chat interface. We have plans to add additional tools that will enable you to create powerful applications and workflows.
- ??? Let us know what you'd like to see! Join our community via the links below, we'd love to hear from you.
Community
Discord | Blog | Bluesky | Twitter
Quick Start (from source)
To run Tome from source rather than using the installers above, clone the repository and install dependencies:

```
git clone https://github.com/runebookai/tome
cd tome
npm install
```
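From there, the usual development workflow applies. If the project follows a standard Tauri layout (an assumption - check package.json for the actual script names), the dev build is typically started with:

```
# assumes the conventional "tauri" npm script exists in package.json
npm run tauri dev
```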