
mcp_project
A terminal-based research assistant built with LLMs, Azure OpenAI, and the arXiv API.
MCP-Powered Research Assistant 🧠📚
This project showcases how to build a terminal-based research assistant using Large Language Models (LLMs), Azure OpenAI, and the arXiv API. It includes two progressively enhanced implementations:
- Baseline version using traditional API calls
- Advanced version using the Model Context Protocol (MCP) via FastMCP
📖 Related Articles
🧰 Building a Terminal-Based Research Assistant Using Azure OpenAI and arXiv
Introduces a basic, step-by-step implementation using arXiv and Azure OpenAI APIs without MCP. Ideal for beginners.
🛠 From API Calls to MCP: Developing an LLM-Powered Research Assistant
Extends the previous implementation by integrating FastMCP for a structured and agent-ready architecture. This article explains the benefits of MCP and showcases tool chaining.
🧠 Expanding FastMCP with Prompts and Resources for Smarter Research Assistants
Completes the series by showing how to go beyond tools, introducing prompt templates and structured resources to enable memory, reasoning, and more advanced agent workflows.
📁 Project Structure
mcp_project/
├── research_server.py # MCP server with arXiv + Azure OpenAI tools/resources
├── mcp_chatbot.py # Client interface to interact with the MCP server
├── run_mcp_chatbot.sh # Shell script to set up and run the chatbot
├── .env # Environment variables (not committed)
├── .gitignore # Git ignore rules
├── pyproject.toml # Dependency configuration for uv
├── papers/ # Local paper metadata cache
└── README.md # This file
📝 About research_server.py
research_server.py is the core backend that exposes research tools and resources via the Model Context Protocol (MCP) using FastMCP. It enables structured, agent-compatible access to academic research workflows.
🔧 Tools
Tools are functions decorated with @mcp.tool() that can be called by LLM agents or clients. Each tool encapsulates a specific research action:
- search_papers(topic, max_results):
  Searches arXiv for papers on a topic, saves metadata locally, and returns a list of paper IDs.
- extract_info(paper_id):
  Retrieves metadata for a specific paper from local storage.
- summarize_paper(text):
  Uses Azure OpenAI to summarize a research paper or text in plain English.
- get_full_text(paper_id):
  Downloads and extracts the full text from a paper's PDF using PyMuPDF.
- list_all_papers():
  Lists all downloaded paper IDs, grouped by topic.
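As a concrete illustration, here is a minimal sketch of how a tool such as extract_info could be registered with FastMCP. The import path follows the official `mcp` Python SDK, and the papers_info.json cache filename is an assumption, not a confirmed detail of this repository; a no-op fallback decorator keeps the sketch runnable even when the package is not installed.

```python
import json
import pathlib

# Assumption: FastMCP from the official `mcp` Python SDK. If it is not
# installed, fall back to a no-op decorator so the sketch still runs.
try:
    from mcp.server.fastmcp import FastMCP
    mcp = FastMCP("research")
    tool = mcp.tool
except ImportError:
    def tool():
        def wrap(fn):
            return fn
        return wrap

@tool()
def extract_info(paper_id: str) -> str:
    """Look up cached metadata for a paper across all topic folders."""
    # Assumed cache layout: papers/<topic>/papers_info.json
    for meta_file in pathlib.Path("papers").glob("*/papers_info.json"):
        info = json.loads(meta_file.read_text())
        if paper_id in info:
            return json.dumps(info[paper_id], indent=2)
    return f"No saved metadata found for paper {paper_id}."
```

Because the decorator returns the function unchanged, the same code path serves both MCP clients and plain Python callers.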
📦 Resources
Resources, decorated with @mcp.resource(), provide structured, navigable data for agents:
- papers://folders
  Lists all available topic folders in the local papers/ directory, allowing agents to browse topics.
- papers://{topic}
  Shows a concise, human-readable summary of all papers under a specific topic, including titles, authors, publication dates, and short summaries.
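A hedged sketch of the papers://folders resource, assuming FastMCP's `@mcp.resource(uri)` decorator from the official `mcp` SDK; the function name and output format are illustrative, with a fallback decorator so the snippet runs without the package.

```python
import pathlib

# Assumption: FastMCP's @mcp.resource decorator takes a URI string and
# registers a zero-argument function for static resources.
try:
    from mcp.server.fastmcp import FastMCP
    mcp = FastMCP("research")
    resource = mcp.resource
except ImportError:
    def resource(uri):
        def wrap(fn):
            return fn
        return wrap

@resource("papers://folders")
def get_available_folders() -> str:
    """List topic folders under papers/ as a markdown bullet list."""
    base = pathlib.Path("papers")
    topics = sorted(d.name for d in base.glob("*") if d.is_dir())
    if not topics:
        return "No topic folders found."
    return "# Available Topics\n" + "\n".join(f"- {t}" for t in topics)
```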
💡 Prompts
Prompts, decorated with @mcp.prompt(), generate structured instructions for LLM agents:
- generate_search_prompt(topic, num_papers):
  Produces a multi-step prompt guiding the agent to:
  - Search for papers on a topic.
  - Extract and summarize key details from each paper.
  - Synthesize an overview of the research landscape, including trends, gaps, and notable works.
  - Format the output with clear headings and bullet points for readability.
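The steps above can be sketched as a prompt function; the exact wording below is an assumption about how generate_search_prompt might phrase its instructions, and the fallback decorator keeps it runnable without the `mcp` package.

```python
# Assumption: FastMCP's @mcp.prompt() decorator; a prompt function simply
# returns an instruction string for the LLM agent.
try:
    from mcp.server.fastmcp import FastMCP
    mcp = FastMCP("research")
    prompt = mcp.prompt
except ImportError:
    def prompt():
        def wrap(fn):
            return fn
        return wrap

@prompt()
def generate_search_prompt(topic: str, num_papers: int = 5) -> str:
    """Build a multi-step research instruction for the agent."""
    return (
        f"Search for {num_papers} academic papers about '{topic}' using the "
        "search_papers tool. For each paper found:\n"
        "1. Extract its metadata with extract_info.\n"
        "2. Summarize its key contributions.\n"
        "Then synthesize an overview of the research landscape, noting "
        "trends, gaps, and notable works, formatted with clear headings "
        "and bullet points."
    )
```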
🛠️ How it works
- The server registers all tools, resources, and prompts with FastMCP.
- Clients (like mcp_chatbot.py) interact with the server using MCP, invoking tools and navigating resources as needed.
- The design supports tool chaining and structured workflows, making it easy to extend or integrate with other agent frameworks.
📝 About mcp_chatbot.py
mcp_chatbot.py is an interactive terminal-based client for the MCP-powered research assistant. It connects to the MCP server (research_server.py) and allows users to discover, invoke, and chain research tools, browse resources, and use structured prompts—all from the command line.
🚀 Features
- Interactive Chat Loop:
  Type natural language queries or special commands to interact with the assistant.
- Tool Discovery & Invocation:
  List all available tools and invoke them directly or as part of a workflow.
- Resource Browsing:
  List and view structured resources (e.g., available topics, paper summaries) exposed by the server.
- Prompt Generation & Execution:
  List available prompts, generate structured research instructions, and execute them with arguments.
- Tool Chaining:
  Automatically chains tools (e.g., search → extract → summarize) for streamlined research tasks.
- Azure OpenAI Integration:
  Uses Azure OpenAI for language understanding and summarization.
🚀 How to Run (MCP Version)
- Clone the repository:
  git clone https://github.com/teeratornk/mcp_project.git
  cd mcp_project
- Create a .env file with your Azure OpenAI credentials:
  AZURE_OPENAI_ENDPOINT=your-endpoint-url
  AZURE_OPENAI_MODEL=your-deployment-name
  AZURE_OPENAI_API_KEY=your-api-key
- Run the chatbot using the provided script:
  bash run_mcp_chatbot.sh
This will:
- Initialize the project with uv
- Set up a virtual environment
- Install required packages if missing
- Launch the chatbot client
📌 Highlights
- Modular Design with FastMCP tools
- Tool Chaining: search_papers → extract_info → summarize_paper
- Local Caching of metadata for efficient repeat queries
- Declarative Tool Use compatible with structured agent frameworks
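The chained workflow can be sketched as follows. The three helpers are stand-ins (assumptions, returning canned data instead of calling arXiv or Azure OpenAI), wired together in the search → extract → summarize order listed above.

```python
# Stub tools standing in for the real MCP tools, for illustration only.
def search_papers(topic: str, max_results: int = 2) -> list[str]:
    # Real version: query arXiv and cache metadata locally.
    return [f"{topic}-paper-{i}" for i in range(max_results)]

def extract_info(paper_id: str) -> dict:
    # Real version: read cached metadata from the papers/ directory.
    return {"id": paper_id, "title": f"Title of {paper_id}"}

def summarize_paper(text: str) -> str:
    # Real version: call Azure OpenAI for a plain-English summary.
    return f"Summary: {text[:60]}"

def research_topic(topic: str, max_results: int = 2) -> dict[str, str]:
    """Chain search_papers -> extract_info -> summarize_paper."""
    summaries = {}
    for pid in search_papers(topic, max_results):
        meta = extract_info(pid)
        summaries[pid] = summarize_paper(meta["title"])
    return summaries
```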
💬 Interactive Commands
Within the chatbot, you can use:
- list tools – Show all available tools
- list resources – Show all available resources
- list prompts – Show all available prompts
- papers://<topic> – Access a specific topic resource
- run prompt <prompt_name> [arg1=val1 arg2=val2 ...] – Generate and execute a prompt with arguments
- <any query> – Let the assistant process your question using LLM and available tools
- help – Show available commands
- quit – Exit the chatbot
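A hypothetical sketch of how a chat loop could route these commands; the dispatch logic below is illustrative, not the actual parser in mcp_chatbot.py.

```python
def route_command(line: str) -> str:
    """Classify a chat-loop input line into an action name (illustrative)."""
    line = line.strip()
    if line in ("list tools", "list resources", "list prompts"):
        return "list:" + line.split()[1]
    if line.startswith("papers://"):
        return "resource:" + line[len("papers://"):]
    if line.startswith("run prompt "):
        name, *args = line[len("run prompt "):].split()
        return f"prompt:{name}({' '.join(args)})"
    if line == "help":
        return "help"
    if line == "quit":
        return "quit"
    return "llm_query"  # anything else goes to the LLM with tools available
```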
📬 Contact
Developed by Teeratorn Kadeethum (Meen)
Feel free to open an issue or reach out if you'd like to collaborate!