
mcp_starter
How to set up an MCP server and an MCP client.
MCP Starter Project
What is MCP?
The Model Context Protocol (MCP) is a standard for building AI applications that can interact with external tools and APIs. It consists of two main components:
- MCP Server: A Python service that defines and exposes tools/functions that can be called by AI models
- MCP Client: A TypeScript/JavaScript client that connects to the MCP server and manages interactions between AI models and tools
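The MCP SDK handles tool registration and transport for you, but the core idea is simple: the server keeps a registry of named, schema-described functions that a client can list and invoke. A framework-free sketch of that idea (the names and schema shape here are illustrative, not the SDK's actual API):

```python
# Illustrative tool registry: maps tool names to a callable plus a schema
# the client can advertise to the model. The real MCP SDK derives the
# schema from type hints; here it is written out by hand.
TOOLS = {}

def tool(name, description, parameters):
    """Register a function as a callable tool."""
    def wrap(fn):
        TOOLS[name] = {"fn": fn, "description": description, "parameters": parameters}
        return fn
    return wrap

@tool("get_docs", "Search library documentation",
      {"query": "string", "library": "string"})
def get_docs(query, library):
    # Placeholder body; the real tool would query a search API.
    return f"results for {query!r} in {library} docs"

def call_tool(name, arguments):
    """Dispatch a tool invocation the way a server handles a tools/call request."""
    return TOOLS[name]["fn"](**arguments)
```

The client's job is then to fetch this registry, hand the schemas to the model, and route any tool calls the model emits back through `call_tool`.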
Project Structure
```
mcp_starter/
├── mcp-server/          # Python MCP server implementation
│   ├── main.py          # Server with documentation search tool
│   └── pyproject.toml   # Python dependencies
└── mcp-clients/         # TypeScript MCP client implementation
    ├── index.ts         # Express server with HuggingFace integration
    └── package.json     # Node.js dependencies
```
Getting Started
Prerequisites
- Python 3.11 or higher
- Node.js 18 or higher
- Hugging Face API key
- Serper API key for Google Search functionality
Setting Up the Server
- Create a Python virtual environment and activate it:
```shell
cd mcp-server
python -m venv .venv
# On Windows
.venv\Scripts\activate
# On macOS/Linux
source .venv/bin/activate
```
- Install dependencies:
```shell
pip install -e .
```
- Create a `.env` file in the `mcp-server` directory:
```
SERPER_API_KEY=your_serper_api_key_here
```
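The server presumably reads this key at startup (typically via a helper like python-dotenv). A stdlib-only sketch of what that loading amounts to, for readers who want to see the mechanics:

```python
import os

def load_dotenv_minimal(path=".env"):
    """Parse KEY=VALUE lines from a .env file and export them.

    Skips blank lines and comments; uses setdefault so values already
    present in the real environment are not overridden.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

In practice you would just call `load_dotenv()` from python-dotenv rather than hand-rolling this.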
Setting Up the Client
- Install Node.js dependencies:
```shell
cd mcp-clients
npm install
```
- Create a `.env` file in the `mcp-clients` directory:
```
HUGGINGFACE_API_KEY=your_huggingface_api_key_here
```
- Build the TypeScript code:
```shell
npm run build
```
Running the Application
- Start the MCP server:
```shell
cd mcp-server
python main.py
```
- In a new terminal, start the client server, passing it the path to the MCP server script:
```shell
cd mcp-clients
node build/index.js ../mcp-server/main.py
```
Using the API
The client exposes two endpoints:
- Health check: `GET http://localhost:3000/health`
- Chat: `POST http://localhost:3000/chat`
Example chat request:
```json
{
  "query": "Search the langchain docs for RAG",
  "sessionId": "user123"
}
```
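Any HTTP client can send this request. A stdlib-only Python sketch that builds the same call (the shape of the response depends on the client code, so it is not assumed here):

```python
import json
from urllib import request

def build_chat_request(query, session_id, base_url="http://localhost:3000"):
    """Build a POST request for the client's /chat endpoint."""
    body = json.dumps({"query": query, "sessionId": session_id}).encode("utf-8")
    return request.Request(
        f"{base_url}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the client running:
#   resp = request.urlopen(build_chat_request(
#       "Search the langchain docs for RAG", "user123"))
#   print(json.load(resp))
```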
Features
- Documentation Search Tool: Search documentation for popular AI libraries:
  - LangChain
  - LlamaIndex
  - OpenAI
- Conversation Management: Maintains chat history per session
- Tool Integration: Seamlessly integrates AI model responses with tool calls
- Error Handling: Robust error handling for API calls and tool execution
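The documentation search tool presumably scopes its Serper (Google Search) queries to each library's documentation site. A sketch of that idea (the domains and query format are assumptions for illustration, not taken from main.py):

```python
# Assumed mapping of supported libraries to their documentation domains.
DOCS_SITES = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai",
    "openai": "platform.openai.com/docs",
}

def build_search_query(query: str, library: str) -> str:
    """Scope a web search to one library's docs via a site: filter."""
    if library not in DOCS_SITES:
        raise ValueError(f"unsupported library: {library}")
    return f"site:{DOCS_SITES[library]} {query}"
```

The resulting string would be sent to the Serper API, and the top results returned to the model as the tool's output.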
How It Works
- The MCP server defines tools that can be called by AI models
- The client connects to the MCP server and retrieves available tools
- When a user sends a query:
  - The client formats the conversation history
  - Sends it to the Hugging Face model
  - Extracts and executes tool calls from the model's response
  - Returns the final response including tool results
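The request cycle above can be sketched in a few lines of Python. The model call and tool execution are stubbed out as parameters; the real client routes them through the Hugging Face API and an MCP session, and the reply shape here is an assumption:

```python
histories = {}  # sessionId -> list of {"role": ..., "content": ...} messages

def handle_chat(session_id, query, call_model, call_tool):
    """One /chat request: update history, query the model, run any tool call."""
    history = histories.setdefault(session_id, [])
    history.append({"role": "user", "content": query})

    reply = call_model(history)          # e.g. a Hugging Face chat completion
    tool_call = reply.get("tool_call")   # assumed shape: {"name", "arguments"}
    if tool_call:
        result = call_tool(tool_call["name"], tool_call["arguments"])
        history.append({"role": "tool", "content": str(result)})
        reply = call_model(history)      # let the model use the tool result

    history.append({"role": "assistant", "content": reply["content"]})
    return reply["content"]
```

Keeping `histories` keyed by session ID is what lets the client maintain per-session conversation state across requests.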
Environment Variables
Server
- SERPER_API_KEY: API key for the Serper Google Search integration
Client
- HUGGINGFACE_API_KEY: API key for accessing Hugging Face models
License
MIT License
Quick Start
Clone the repository:
```shell
git clone https://github.com/sharmatriloknath/mcp_starter
cd mcp_starter
```
Then follow the server and client setup steps above.