
File Context
File Context is a tool that allows developers to work with their codebase in a more contextual manner. It provides a web interface for exploring, searching, and understanding your code, enhanced with AI capabilities. The system is designed to work with multiple AI providers including Ollama, llama.cpp, and Together.ai, giving you flexibility in choosing your preferred AI backend.
Project Overview
The project consists of two main components:
- Client - A frontend application built with React/TypeScript that provides a user interface for interacting with your codebase.
- File Context MCP (Model Context Protocol server) - A backend service that handles file operations, AI processing, and serves the API for the client.
Features
- Interactive file explorer
- Code search and navigation
- AI-powered code understanding and analysis
- Support for multiple AI models including local (Ollama, llama.cpp) and cloud-based models (via Together AI)
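The multi-provider support above can be pictured as a small routing helper. This is a hypothetical sketch, not the project's actual code: the function name and fallback URLs are assumptions, though the environment variable names match those documented in the Configuration Options section.

```typescript
// Hypothetical sketch: choosing a base URL per AI provider.
// The fallback URLs assume a default local setup.
type Provider = "ollama" | "llama.cpp" | "together";

function endpointFor(provider: Provider): string {
  switch (provider) {
    case "ollama":
      return process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";
    case "llama.cpp":
      return process.env.LLAMA_CPP_BASE_URL ?? "http://localhost:8080";
    case "together":
      // Together AI is a hosted service, so the URL is fixed.
      return "https://api.together.xyz";
  }
}
```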
Prerequisites
- Docker and Docker Compose
- Node.js (LTS version recommended)
- An API key from Together AI (optional, for cloud-based models)
- Ollama or llama.cpp running locally (optional, for local models)
Quick Start
Using Docker Compose (Recommended)
1. Clone the repository:

   git clone https://github.com/yourusername/file-context.git
   cd file-context

2. Create a .env file in the file-context-mcp directory:

   TOGETHER_API_KEY=your_together_api_key              # Optional
   OLLAMA_BASE_URL=http://host.docker.internal:11434   # If using Ollama locally
   LLAMA_CPP_BASE_URL=http://host.docker.internal:8080 # If using llama.cpp locally
   MODEL_NAME=llama3.2                                 # Default model to use

3. Start the application using Docker Compose:

   docker-compose up

4. Access the application in your browser at http://localhost:5173.
Manual Setup
If you prefer to run the components separately:
Backend (file-context-mcp)
1. Navigate to the backend directory:

   cd file-context-mcp

2. Install dependencies:

   npm install

3. Create a .env file with the configuration mentioned above.

4. Build and start the server:

   npm run build
   npm start
The API server will start on port 3001 (or as configured in your .env file).
Frontend (client)
1. Navigate to the client directory:

   cd client

2. Install dependencies:

   npm install

3. Start the development server:

   npm run dev
The client will be available at http://localhost:5173.
Configuration Options
Backend Environment Variables
- PORT: Port for the API server (default: 3001)
- TOGETHER_API_KEY: API key for Together AI
- OLLAMA_BASE_URL: Base URL for the Ollama API
- LLAMA_CPP_BASE_URL: Base URL for the llama.cpp API
- MODEL_NAME: Default AI model to use
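A minimal sketch of how a Node backend might read these variables, falling back to the documented defaults. The file layout and object shape are assumptions for illustration:

```typescript
// Hypothetical config loader: reads the documented environment
// variables and applies the documented defaults when unset.
const config = {
  port: Number(process.env.PORT ?? 3001),
  togetherApiKey: process.env.TOGETHER_API_KEY, // undefined if not set
  ollamaBaseUrl: process.env.OLLAMA_BASE_URL ?? "http://localhost:11434",
  llamaCppBaseUrl: process.env.LLAMA_CPP_BASE_URL ?? "http://localhost:8080",
  modelName: process.env.MODEL_NAME ?? "llama3.2",
};
```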
Frontend Environment Variables
- VITE_API_URL: URL of the backend API (default in Docker: http://localhost:3001)
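In a Vite client, variables prefixed with VITE_ are exposed through import.meta.env at build time. A hypothetical helper (not the project's actual code) that applies the Docker default when the variable is unset, called in the client as resolveApiUrl(import.meta.env):

```typescript
// Hypothetical helper: resolve the backend URL, falling back to
// the default used in the Docker setup when VITE_API_URL is unset.
function resolveApiUrl(env: Record<string, string | undefined>): string {
  return env.VITE_API_URL ?? "http://localhost:3001";
}
```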
Project Structure