
promptcode-preset-mcp
A CLI tool for managing file presets for code prompting. Quickly assemble collections of files from your codebase into prompts for AI/LLM interactions.
Installation
Global installation (recommended)
# Install from npm
pnpm install -g @cogflows/promptcode-preset-mcp
# Or install from GitHub
pnpm install -g github:cogflows/promptcode-preset-mcp
Local development
# Clone the repository
git clone https://github.com/cogflows/promptcode-preset-mcp.git
cd promptcode-preset-mcp
# Install dependencies
pnpm install
# Build and link globally
pnpm build
pnpm link --global
Usage
List presets
promptcode ls
Get preset content
# Export to temp file and print path
promptcode get <preset-name>
# Export and open in default editor
promptcode get <preset-name> --open
Set workspace directory
# Use environment variable
WORKSPACE=/path/to/project promptcode ls
WORKSPACE=/path/to/project promptcode get core
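How the workspace is resolved when WORKSPACE is unset isn't documented here; falling back to the current working directory is a reasonable assumption. A minimal TypeScript sketch of that resolution (hypothetical helper names, not the actual promptcode source):

```typescript
import * as path from "node:path";

// Hypothetical helper: resolve the workspace from the WORKSPACE environment
// variable, falling back to the current working directory.
export function resolveWorkspace(): string {
  return path.resolve(process.env.WORKSPACE ?? process.cwd());
}

// Presets are expected under .promptcode/presets/ inside the workspace.
export function presetsDir(workspace: string = resolveWorkspace()): string {
  return path.join(workspace, ".promptcode", "presets");
}
```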
Creating Presets
Presets are JSON files stored in .promptcode/presets/ within your workspace directory.
Example preset file .promptcode/presets/core.json:
{
  "name": "core",
  "files": [
    "src/main.ts",
    "src/utils.ts",
    "README.md"
  ]
}
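The preset shape maps onto a small TypeScript interface. The following is a sketch of what src/types/filePreset.ts might define, inferred from the JSON example above and the ls output shown in the workflow below, not copied from the source:

```typescript
// Sketch of the preset shape implied by the JSON example above.
export interface FilePreset {
  /** Preset name, matching the JSON file's basename (e.g. "core"). */
  name: string;
  /** Workspace-relative paths of the files to include in the prompt. */
  files: string[];
}

// Assumed shape of one entry in the `promptcode ls` output.
export interface PresetSummary {
  name: string;
  fileCount: number;
  totalTokens: number;
}
```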
Example Workflow
1. Create a presets directory in your project:
mkdir -p .promptcode/presets

2. Create preset files:
echo '{ "name": "api", "files": ["src/api/index.ts", "src/api/routes.ts", "src/types.ts"] }' > .promptcode/presets/api.json

3. List available presets:
promptcode ls
Output:
[ { "name": "api", "fileCount": 3, "totalTokens": 1250 } ]

4. Export preset content:
promptcode get api
Output:
/var/folders/.../T/preset-api-2025-06-02T19-12-41-238Z.txt

5. Use the exported file with your AI tool:
# Copy to clipboard (macOS)
cat $(promptcode get api) | pbcopy
# Or open directly
promptcode get api --open
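Under the hood, the export step can be pictured as: read the preset JSON, concatenate the listed files, write the result to a temp file, and print only the path so it composes well in shell pipelines. The sketch below is a hypothetical reimplementation for illustration, not the tool's actual code:

```typescript
import { promises as fs } from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Hypothetical sketch of what `promptcode get <name>` likely does.
export async function exportPreset(workspace: string, name: string): Promise<string> {
  const presetPath = path.join(workspace, ".promptcode", "presets", `${name}.json`);
  const preset = JSON.parse(await fs.readFile(presetPath, "utf8")) as { files: string[] };

  // Concatenate each listed file with a small header so the files stay distinguishable.
  const sections = await Promise.all(
    preset.files.map(async (rel) => {
      const content = await fs.readFile(path.join(workspace, rel), "utf8");
      return `=== ${rel} ===\n${content}`;
    })
  );

  // Write to a timestamped temp file and return its path (the CLI prints it).
  const stamp = new Date().toISOString().replace(/[:.]/g, "-");
  const outPath = path.join(os.tmpdir(), `preset-${name}-${stamp}.txt`);
  await fs.writeFile(outPath, sections.join("\n\n"), "utf8");
  return outPath;
}
```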
Features
- Token counting: Uses OpenAI's tiktoken to count tokens in files
- Caching: Token counts are cached for performance (see the sketch after this list)
- Workspace support: Work with presets from any project directory
- Clean output: Returns just the file path for easy scripting
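The token counting and caching features can be illustrated with a short sketch built on the tiktoken npm package. The encoding choice and the mtime-based cache key are assumptions; the real tokenCounter.ts may differ:

```typescript
import { promises as fs } from "node:fs";
import { get_encoding } from "tiktoken";

// Hypothetical cached token counter, not the actual tokenCounter.ts.
// The cache key includes the file's mtime so edits invalidate stale entries.
const cache = new Map<string, number>();

export async function countTokens(filePath: string): Promise<number> {
  const { mtimeMs } = await fs.stat(filePath);
  const key = `${filePath}:${mtimeMs}`;
  const cached = cache.get(key);
  if (cached !== undefined) return cached;

  const text = await fs.readFile(filePath, "utf8");
  const enc = get_encoding("cl100k_base"); // encoding choice is an assumption
  try {
    const count = enc.encode(text).length;
    cache.set(key, count);
    return count;
  } finally {
    enc.free(); // the WASM-backed encoder must be freed explicitly
  }
}
```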
Development
The project is written in TypeScript and uses:
- Node.js ES modules
- tiktoken for token counting
- TypeScript for type safety
Project Structure
src/
├── cli.ts # CLI interface
├── presetManager.ts # Preset loading/saving logic
├── tokenCounter.ts # Token counting with caching
└── types/
└── filePreset.ts # TypeScript interfaces