
LlamaCloud MCP Server
An MCP server connecting to multiple managed indexes on LlamaCloud
This is a TypeScript-based MCP server that creates multiple tools, each connected to a specific managed index on LlamaCloud. Each tool is defined through command-line arguments.
Features
Tools
- Creates a separate tool for each index you define
- Each tool provides a query parameter to search its specific index
- Auto-generates tool names like get_information_index_name based on index names
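For example, a client calling the tool generated for an index named 10k-SEC-Tesla (used in the example config below) would send an MCP tools/call request along these lines. The exact tool name shown here, with hyphens mapped to underscores, is an illustrative assumption and not confirmed by the repository:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_information_10k_SEC_Tesla",
    "arguments": { "query": "What was Tesla's total revenue in 2023?" }
  }
}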
Installation
To use the server with your MCP client (e.g. Claude Desktop, Windsurf, or Cursor), add the following to your MCP client config:
The LLAMA_CLOUD_PROJECT_NAME environment variable is optional and defaults to Default if not set.
{
  "mcpServers": {
    "llamacloud": {
      "command": "npx",
      "args": [
        "-y",
        "@llamaindex/mcp-server-llamacloud",
        "--index",
        "10k-SEC-Tesla",
        "--description",
        "10k SEC documents from 2023 for Tesla",
        "--topK",
        "5",
        "--index",
        "10k-SEC-Apple",
        "--description",
        "10k SEC documents from 2023 for Apple"
      ],
      "env": {
        "LLAMA_CLOUD_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
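As noted above, LLAMA_CLOUD_PROJECT_NAME is optional and defaults to Default. If your indexes live in a different project, it can be added to the same env block; the project name below is a placeholder, not a value from the repository:

"env": {
  "LLAMA_CLOUD_API_KEY": "<YOUR_API_KEY>",
  "LLAMA_CLOUD_PROJECT_NAME": "<YOUR_PROJECT_NAME>"
}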
For Claude, the MCP config can be found at:
- On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- On Windows: %APPDATA%/Claude/claude_desktop_config.json
Tool Definition Format
In the args array of the MCP config, you can define multiple tools by providing pairs of --index and --description arguments. Each pair defines a new tool. You can also optionally specify --topK to limit the number of results.
For example:
--index "10k-SEC-Tesla" --description "10k SEC documents from 2023 for Tesla" --topK 5
This adds a tool for the 10k-SEC-Tesla LlamaCloud index to the MCP server; in this example, it is configured to return the top 5 results.
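To make the grouping concrete, here is a minimal TypeScript sketch of how paired --index/--description arguments (with an optional --topK) could be collected into per-index tool definitions. The names ToolDefinition and parseToolDefinitions are hypothetical and are not taken from the repository's source:

// Illustrative sketch only: group CLI argument pairs into tool definitions.
interface ToolDefinition {
  indexName: string;
  description: string;
  topK?: number;
}

function parseToolDefinitions(args: string[]): ToolDefinition[] {
  const tools: ToolDefinition[] = [];
  let current: Partial<ToolDefinition> = {};

  const flush = () => {
    if (current.indexName && current.description) {
      tools.push(current as ToolDefinition);
    }
    current = {};
  };

  // Arguments arrive as flag/value pairs, e.g. "--index", "10k-SEC-Tesla".
  for (let i = 0; i + 1 < args.length; i += 2) {
    const flag = args[i];
    const value = args[i + 1];
    if (flag === "--index") {
      flush(); // each new --index starts a new tool definition
      current.indexName = value;
    } else if (flag === "--description") {
      current.description = value;
    } else if (flag === "--topK") {
      current.topK = Number(value);
    }
  }
  flush();
  return tools;
}

// The example config above would yield two tool definitions, one per index,
// with topK set to 5 only for the Tesla index.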
Development
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
To use the development version, replace npx @llamaindex/mcp-server-llamacloud in your MCP config with node ./build/index.js.
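A development config could then look roughly like this; the relative ./build/index.js path is illustrative and may need to be an absolute path, depending on the working directory your MCP client uses:

{
  "mcpServers": {
    "llamacloud": {
      "command": "node",
      "args": [
        "./build/index.js",
        "--index",
        "10k-SEC-Tesla",
        "--description",
        "10k SEC documents from 2023 for Tesla"
      ],
      "env": {
        "LLAMA_CLOUD_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}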
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
Quick Start
Clone the repository:
git clone https://github.com/run-llama/mcp-server-llamacloud
Install dependencies:
cd mcp-server-llamacloud
npm install
Follow the documentation:
Check the repository's README.md file for specific installation and usage instructions.