
llm mcp streamlit
An improved project based on LLaMa MCP Streamlit, using Google Maps MCP and a locally hosted LLM.
Documentation
llm-mcp-streamlit
Overview
This project is largely a fork of the LLaMa MCP Streamlit project, with a few differences:
- The MCP server used in this demo has been changed from Playwright MCP to Google Maps MCP.
- The .env file has been altered to use a locally hosted LLM served through an OpenAI-compatible API.
- Slight change to the mcp_client script to account for the way Google Maps lists its tools (see the sketch after this list).
- Drastic shortening of the system prompt.
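To make the Google Maps change and the mcp_client tweak concrete, here is a minimal sketch, not the repository's actual mcp_client code, that lists the reference Google Maps MCP server's tools over stdio and reshapes them into OpenAI-style tool definitions for the local LLM; the launch command, environment variable name, and function names are assumptions.

```python
# Minimal sketch, assuming the official `mcp` Python SDK and the reference
# Google Maps MCP server launched via npx; names here are illustrative,
# not the repository's actual code.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

GOOGLE_MAPS_SERVER = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-google-maps"],
    env={"GOOGLE_MAPS_API_KEY": os.environ.get("GOOGLE_MAPS_API_KEY", "")},
)

async def list_openai_tools() -> list[dict]:
    """List the MCP server's tools and reshape them into OpenAI tool definitions."""
    async with stdio_client(GOOGLE_MAPS_SERVER) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listing = await session.list_tools()
            return [
                {
                    "type": "function",
                    "function": {
                        "name": tool.name,
                        "description": tool.description or "",
                        "parameters": tool.inputSchema,  # JSON Schema provided by the server
                    },
                }
                for tool in listing.tools
            ]

if __name__ == "__main__":
    print(asyncio.run(list_openai_tools()))
```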
There are still a few things that I want to work on:
- Containerize the application
- Determine how multiple tools can be used
- Mix Stdio and SSE servers
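For the stdio/SSE item, one possible direction is sketched below using the `mcp` SDK's stdio and SSE clients; the SSE URL and server command are placeholders, not anything configured in this repository.

```python
# Sketch of mixing transports, assuming the official `mcp` SDK's stdio and SSE
# clients; the SSE URL and the server command below are placeholders.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.sse import sse_client
from mcp.client.stdio import stdio_client

async def tool_names(read, write) -> list[str]:
    """Initialize a session on an open transport and return its tool names."""
    async with ClientSession(read, write) as session:
        await session.initialize()
        return [tool.name for tool in (await session.list_tools()).tools]

async def main() -> None:
    stdio_params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-google-maps"],
        env={"GOOGLE_MAPS_API_KEY": os.environ.get("GOOGLE_MAPS_API_KEY", "")},
    )
    async with stdio_client(stdio_params) as (stdio_read, stdio_write):
        stdio_tools = await tool_names(stdio_read, stdio_write)

    # Hypothetical SSE-based MCP server running locally.
    async with sse_client("http://localhost:8000/sse") as (sse_read, sse_write):
        sse_tools = await tool_names(sse_read, sse_write)

    print("stdio tools:", stdio_tools)
    print("sse tools:", sse_tools)

if __name__ == "__main__":
    asyncio.run(main())
```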
Running the code
- Clone this repo
- Obtain a Google Maps API key from the Google Maps Platform in the Google Cloud Console.
- Rename the .env.example file to .env
- Enter your inference server's API URL, API key, and Google Maps API key into the appropriate fields in the .env file (see the sketch after this list)
- Execute the run.sh script or run
poetry run streamlit run llm-mcp-streamlit/main.py
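A rough sketch of how those .env values can be consumed is shown below; the environment variable names are assumptions based on the description above, not necessarily the keys used in .env.example.

```python
# Sketch only: the environment variable names below are assumptions,
# not necessarily the exact keys defined in this repo's .env.example.
import os

from dotenv import load_dotenv  # python-dotenv
from openai import OpenAI

load_dotenv()  # read .env from the project root

# OpenAI-compatible client pointed at the locally hosted inference server.
client = OpenAI(
    base_url=os.environ["API_URL"],
    api_key=os.environ["API_KEY"],
)

# The Google Maps key is passed to the MCP server process, not to the LLM client.
google_maps_api_key = os.environ["GOOGLE_MAPS_API_KEY"]

response = client.chat.completions.create(
    model=os.environ.get("MODEL", "local-model"),  # hypothetical model setting
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```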
Quick Start
Clone the repository
git clone https://github.com/tedbrunell/llm-mcp-streamlit
Install dependencies
cd llm-mcp-streamlit
poetry install
Follow the documentation
Check the repository's README.md file for specific installation and usage instructions.