
news_search_mcp
Generates AI news podcast scripts using local Ollama models and provides an interactive interface.
Documentation
AI News Podcast Generator with Ollama Interface
This project consists of two main components:
- MCP AI News Podcast Server: A Flask-based server that generates podcast scripts about AI news using local Ollama models.
- Ollama News Interface: A React application that provides a user interface to interact with both local Ollama models and the MCP AI News Podcast Server.
Project Structure
/news_search_mcp
├── mcp_ai_news_podcast_server/ # Flask server for podcast generation
│ ├── src/ # Server source code
│ │ ├── main.py # Flask app entry point
│ │ ├── news_fetcher.py # News fetching module (conceptual)
│ │ ├── ollama_interactor.py # Ollama API interaction
│ │ └── script_generator.py # Podcast script generation
│ ├── requirements.txt # Python dependencies
│ └── README.md # Server documentation
├── ollama-news-interface/ # React frontend
│ ├── src/ # React source code
│ │ ├── App.js # Main React component
│ │ ├── App.css # Styling
│ │ └── setupProxy.js # Proxy configuration
│ ├── package.json # Node.js dependencies
│ └── README.md # Frontend documentation
├── mcp_server_architecture.md # System architecture document
└── README.md # This file
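The ollama_interactor.py module listed above talks to the locally running Ollama instance. A minimal sketch of what that interaction might look like, assuming Ollama's default REST endpoint at http://localhost:11434/api/generate (the function names here are illustrative, not the project's actual API):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming generate request body for the Ollama REST API."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama instance and return the response text."""
    data = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream` set to False, Ollama returns a single JSON object whose `response` field holds the full completion, which keeps the server-side code simple.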
Prerequisites
- Python 3.10+ for the Flask server
- Node.js and npm for the React application
- Ollama installed and running locally
- curl for API requests
Setup and Running
1. Start the MCP AI News Podcast Server
cd mcp_ai_news_podcast_server
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
python src/main.py
The server will start on http://127.0.0.1:5000
2. Start the Ollama News Interface
cd ollama-news-interface
npm install
npm start
The React application will start on http://localhost:3000
How to Use
- Open the React application in your browser at http://localhost:3000
- Select an Ollama model from the dropdown list
- Enter your query in the input text box:
- For general queries, the application will send them directly to the selected Ollama model
- For AI news-related queries (containing keywords like "news", "AI news", etc.), the application will send the request to the MCP AI News Podcast Server
- Click "Submit" to process your query
- The response will appear in the output box below
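The keyword-based routing described above lives in the React frontend (App.js); a Python rendering of the same idea, with an assumed keyword list, might look like this:

```python
# Assumed keyword list; the actual keywords live in the frontend code.
NEWS_KEYWORDS = ("news", "ai news", "headlines")

def is_news_query(query: str) -> bool:
    """Return True if the query should be routed to the podcast server
    rather than sent directly to the selected Ollama model."""
    q = query.lower()
    return any(keyword in q for keyword in NEWS_KEYWORDS)
```

A query like "What's the latest AI news?" would be routed to the podcast server, while "Explain transformers" would go straight to the Ollama model.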
Features
- Model Selection: Choose from locally installed Ollama models
- Automatic Query Routing: Detects AI news queries and routes them to the podcast server
- Podcast Script Generation: Creates structured podcast scripts about AI news
- Error Handling: Provides clear error messages for troubleshooting
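The "structured podcast script" feature could be sketched as a pure function like the one below (the intro/segment/outro structure and function name are assumptions, not the project's actual format):

```python
def assemble_script(headlines: list[str]) -> str:
    """Assemble a podcast script with an intro, one segment per headline, and an outro."""
    intro = "Welcome to today's AI news roundup."
    segments = [f"Segment {i}: {headline}" for i, headline in enumerate(headlines, start=1)]
    outro = "That's all for today. Thanks for listening!"
    return "\n\n".join([intro, *segments, outro])
```

In the real server, each segment body would be filled in by prompting the selected Ollama model with the corresponding headline.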
Notes
- The news fetching functionality is conceptual and uses mock data for testing
- In a real scenario, an AI agent would handle the actual news fetching using web search tools
- The podcast server requires Ollama to be installed and running locally
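Since the news fetching is conceptual, news_fetcher.py presumably returns hard-coded items along these lines (the data and function name here are illustrative mock values):

```python
# Mock items standing in for real web-search results, as noted above.
MOCK_NEWS = [
    {"title": "Example: new open-source LLM released", "source": "mock"},
    {"title": "Example: AI regulation update", "source": "mock"},
]

def fetch_ai_news(limit: int = 5) -> list[dict]:
    """Return up to `limit` mock news items.
    In a real deployment, an AI agent with web-search tools would replace this."""
    return MOCK_NEWS[:limit]
```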
Quick Start
Clone the repository
git clone https://github.com/evinhua/news_search_mcp
Install dependencies
cd news_search_mcp
npm install
Follow the documentation
Check the repository's README.md file for specific installation and usage instructions.