
bedrock mcp streamlit
A chat application built with Streamlit that integrates MCP tools and Bedrock models.
Bedrock Chat with MCP tool
This is a chat application built with Streamlit and integrated with the MCP (Model Context Protocol) tool.
Overview
Bedrock Chat with MCP tool is a chat application built with Streamlit and integrated with the MCP (Model Context Protocol) tool.
This application uses Langchain and Bedrock to create a chat model, passing the model specified in config.json to Langchain's init_chat_model function (https://python.langchain.com/docs/how_to/chat_models_universal_init/). It interacts with the MCP (Model Context Protocol) servers defined in mcp_config.json and accesses their tools. MCP is an open protocol that standardizes how applications provide context to LLMs (https://modelcontextprotocol.io/). Chat history is stored in YAML files. The util.py module defines MessageProcessor and its subclasses to handle message processing for different models.
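The MessageProcessor hierarchy in util.py is not reproduced here; as a rough illustration of the pattern described above (one subclass per model family — all class and method names below are hypothetical, not the actual util.py API):

```python
from abc import ABC, abstractmethod

# Rough illustration of the MessageProcessor pattern described above.
# Class and method names here are hypothetical, not the actual util.py API.
class MessageProcessor(ABC):
    @abstractmethod
    def process(self, message: str) -> str:
        """Turn a raw chat message into model-specific input."""

class ClaudeMessageProcessor(MessageProcessor):
    def process(self, message: str) -> str:
        return f"[claude] {message}"

class NovaMessageProcessor(MessageProcessor):
    def process(self, message: str) -> str:
        return f"[nova] {message}"

# The app would pick a subclass based on the model selected in the sidebar.
processor: MessageProcessor = ClaudeMessageProcessor()
result = processor.process("hello")
```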
The config.json file allows you to configure the LLM model to use, where the chat history files are stored, etc.
The mcp_config.json file describes the configuration of the MCP server.
In the Streamlit sidebar, you can configure the following:
- Select LLM model
- Change chat history directory
- Change MCP configuration file
- Select past chat history
Features
- Chat interface using Streamlit
- Integration with MCP tools
- Uses Langchain and Bedrock for the chat model
- LLM model and other settings configurable in config.json
- MCP server settings read from mcp_config.json
- Chat history saved in YAML format
- Settings adjustable from the Streamlit sidebar
Setup
- Install dependencies:
  pip install streamlit langchain langchain-aws langchain_mcp_adapters
- Configure the MCP server in mcp_config.json.
- Run the application:
  streamlit run src/main.py
Configuration
The config.json file is where you configure the LLM model and other settings.
{
  "chat_history_dir": "chat_history",
  "mcp_config_file": "mcp_config.json",
  "models": {
    "Claude 3.7 Sonnet": {
      "model_provider": "bedrock_converse",
      "model": "us.anthropic.claude-3-7-sonnet-20250219-v1:0"
    },
    "Amazon Nova Pro": {
      "model_provider": "bedrock_converse",
      "model": "us.amazon.nova-pro-v1:0"
    }
  }
}
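As a sketch of how a model entry from this file maps onto Langchain's init_chat_model(model, model_provider=...) parameters (the helper below is illustrative, not the app's actual code):

```python
import json

# Illustrative helper (not the app's actual code): look up a model entry
# from config.json and build the keyword arguments expected by Langchain's
# init_chat_model(model, model_provider=...).
CONFIG = json.loads("""
{
  "chat_history_dir": "chat_history",
  "mcp_config_file": "mcp_config.json",
  "models": {
    "Amazon Nova Pro": {
      "model_provider": "bedrock_converse",
      "model": "us.amazon.nova-pro-v1:0"
    }
  }
}
""")

def model_kwargs(name: str) -> dict:
    entry = CONFIG["models"][name]
    return {"model": entry["model"], "model_provider": entry["model_provider"]}

kwargs = model_kwargs("Amazon Nova Pro")
# The app would then call init_chat_model(**kwargs).
```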
The mcp_config.json file contains the settings for the MCP server.
Please note that the transport field is required for each server.
{
  "mcpServers": {
    "server1": {
      "command": "...",
      "args": ["..."],
      "env": {
        "API_KEY": "..."
      },
      "transport": "..."
    }
  }
}
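Since a missing transport field is an easy mistake to make, a minimal sketch of a check that each server entry declares it (this validator is illustrative, not part of the app; the server name and field values below are placeholders, and "stdio" is one valid transport):

```python
import json

# Minimal sketch (not the app's actual code) that verifies every server in
# mcp_config.json declares the required "transport" field. The server name
# and field values below are placeholders; "stdio" is one valid transport.
MCP_CONFIG = json.loads("""
{
  "mcpServers": {
    "server1": {
      "command": "python",
      "args": ["server.py"],
      "env": {"API_KEY": "dummy"},
      "transport": "stdio"
    }
  }
}
""")

missing = [
    name
    for name, entry in MCP_CONFIG["mcpServers"].items()
    if "transport" not in entry
]
ok = not missing  # True when every server declares a transport
```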
Usage
To start the Streamlit application, run the following command.
streamlit run src/main.py
Once the application is running, enter a message in the chat input box. After you send it, the chat model and MCP tools will generate a response.
In the sidebar, you can configure the LLM model, chat history directory, and MCP config file. You can also select past chat history and resume the conversation.
Notes
- Write the MCP server configuration in mcp_config.json.
- To use Bedrock, you need an AWS account.
- Chat history is stored in YAML files.
- config.json and mcp_config.json must be in the same directory as the application.
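The exact chat-history schema is not documented here; a stored history file might look something like this (the field names are assumptions, and the app's actual YAML layout may differ):

```yaml
# Hypothetical layout of a saved chat-history file; the app's actual
# schema may differ.
- role: user
  content: What tools are available?
- role: assistant
  content: The MCP server exposes the following tools ...
```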
Quick Start
- Clone the repository:
  git clone https://github.com/moritalous/bedrock-mcp-streamlit
  cd bedrock-mcp-streamlit
- Install dependencies:
  pip install streamlit langchain langchain-aws langchain_mcp_adapters
- Run the application:
  streamlit run src/main.py