
ollama-mcp
Bridging the gap between Ollama and MCP servers
Ollama + MCP: Local AI Research Assistant
Ever wanted to chat with your local AI about research papers without sending data to external APIs? This project lets you do exactly that using Ollama and the Model Context Protocol (MCP).
What it does
I built this to learn more about how MCP works and how to get it working with Ollama. Turns out it's also pretty useful for research - now I can just ask my local AI model to find papers, summarize them, and keep everything organized automatically.
The setup includes:
- Custom ArXiv research server - searches and organizes academic papers (built following an MCP course)
- File system access - using the official MCP filesystem server
- Web scraping - using the official MCP fetch server
- MCP client that works with Ollama - the tricky integration part I figured out
Quick start
You'll need Python 3.13+, the uv package manager, and Ollama running locally.
# Get the code
git clone https://github.com/career-genomics/ollama-mcp.git
cd ollama-mcp
# Set up environment
uv venv && source .venv/bin/activate
uv pip install -e .
# Make sure Ollama is running
ollama serve
ollama pull qwen3:14b # or qwen3:8b for lighter usage
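If the client can't reach Ollama later, a quick way to check that the server is actually up (Ollama's API listens on port 11434 by default):
curl http://localhost:11434/api/tags # should return the models you've pulled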
How to use it
Full MCP experience (recommended)
python src/mcp_client/mcp_chatbot.py
Then you can:
- Ask: "Find papers about quantum machine learning"
- Use @folders to see your research topics
- Use @quantum_machine_learning to browse saved papers
- Regular research questions work great too
Simple chatbot (no MCP setup needed)
python src/chat-bot/chatbot-loop.py
Just the ArXiv server
python main.py
What makes it useful
The AI can actually do things, not just chat:
- Search ArXiv and save papers organized by topic
- Remember what it found in previous sessions
- Fetch web content and analyze it
- Work with your local files
All your data stays local. The AI runs on your machine through Ollama, and research papers get saved in a simple papers/ folder structure.
Configuration
Want to use a different model? Edit the model name in src/mcp_client/mcp_chatbot.py:
chatbot = MCPChatBot(model_name="qwen3:8b") # or llama2, etc.
The MCP servers are configured in src/configs/server_config.json - you can add or remove services there.
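For reference, stdio MCP server configs usually look something like the sketch below. Treat it as an illustration rather than the project's actual file: the server names and the arxiv_research entry point are assumptions, while the filesystem and fetch invocations are the ones documented for the official servers.
{
  "mcpServers": {
    "arxiv_research": {
      "command": "python",
      "args": ["src/mcp_servers/arxiv_research/server.py"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}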
Example workflow
You: "Find recent papers on transformer improvements"
AI: [searches ArXiv, finds 5 papers, saves them locally]
"I found 5 recent papers on transformer improvements. Here's what I discovered..."
[provides detailed analysis]
You: "@folders"
AI: Shows: transformer_improvements
You: "@transformer_improvements"
AI: [displays organized information about all saved papers]
Project structure
src/
  mcp_client/mcp_chatbot.py     # Full-featured MCP client
  mcp_servers/arxiv_research/   # ArXiv search server
  chat-bot/chatbot-loop.py      # Simple standalone version
  configs/server_config.json    # MCP server settings
papers/                         # Where research gets saved
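To give a feel for what lives in mcp_servers/arxiv_research/, here's a minimal sketch of an MCP tool server in that shape. This is not the project's actual code: it assumes the FastMCP helper from the mcp SDK and the arxiv package, and the tool name and file layout are illustrative.

import json
from pathlib import Path

import arxiv
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("arxiv_research")

@mcp.tool()
def search_papers(topic: str, max_results: int = 5) -> list[str]:
    """Search ArXiv for a topic and save results under papers/<topic>/."""
    client = arxiv.Client()
    search = arxiv.Search(query=topic, max_results=max_results)
    folder = Path("papers") / topic.lower().replace(" ", "_")
    folder.mkdir(parents=True, exist_ok=True)
    saved = {}
    for paper in client.results(search):
        saved[paper.get_short_id()] = {
            "title": paper.title,
            "summary": paper.summary,
            "url": paper.entry_id,
        }
    (folder / "papers_info.json").write_text(json.dumps(saved, indent=2))
    return list(saved)

if __name__ == "__main__":
    mcp.run(transport="stdio")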
Common issues
Can't connect to Ollama? Make sure it's running with ollama serve
MCP servers not working? Check that all dependencies are installed with uv pip install -e .
Out of memory? Try a smaller model like qwen3:8b instead of the 14B version
Why I built this
I couldn't find a consistent, working way to run MCP servers with Ollama. Most examples online are either incomplete, outdated, or assume you're using Claude/OpenAI APIs.
The official MCP servers (filesystem, fetch, etc.) work great, but connecting them to Ollama models was the missing piece. After digging through docs and experimenting, I figured out how to make the integration work reliably - handling tool calling format conversion, message flow, and multi-server coordination.
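The core of that integration is small once you see it: MCP tool definitions get translated into the function schema Ollama's chat API accepts, and any tool calls the model emits are routed back through the MCP session. Here's a condensed sketch of the idea, assuming the ollama Python client and the official mcp SDK; it's illustrative, not the exact code in mcp_chatbot.py.

# Sketch: bridging MCP tools to Ollama tool calling (illustrative)
import asyncio
import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

def to_ollama_tool(tool):
    # MCP tool definition -> the function schema ollama.chat() accepts
    return {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description or "",
            "parameters": tool.inputSchema,
        },
    }

async def ask(session, prompt, model="qwen3:14b"):
    tools = [to_ollama_tool(t) for t in (await session.list_tools()).tools]
    messages = [{"role": "user", "content": prompt}]
    response = ollama.chat(model=model, messages=messages, tools=tools)
    # If the model requested tools, execute them over MCP and feed the
    # results back so the model can produce a final answer.
    while response.message.tool_calls:
        messages.append(response.message)
        for call in response.message.tool_calls:
            result = await session.call_tool(
                call.function.name, dict(call.function.arguments)
            )
            messages.append({"role": "tool", "content": str(result.content)})
        response = ollama.chat(model=model, messages=messages, tools=tools)
    return response.message.content

async def main():
    # Spawn the ArXiv server from this repo over stdio
    params = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print(await ask(session, "Find papers about quantum machine learning"))

asyncio.run(main())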
I also built a custom ArXiv research server since I was tired of manually tracking papers. Now it's all automated and stays local.
Contributing
Found a bug or want to add a feature? Pull requests welcome! The code is pretty straightforward to understand and modify.
The filesystem and fetch servers come from the official MCP servers repository - the main contribution here is the Ollama integration layer and custom ArXiv research server.
License
Apache 2.0 - use it however you want.
Built because research should be easier, not harder.