
local_aider
A tool powered by local large language models to support data engineers.
Local AIDER
AI for the Data Engineer using Local LLMs
Setup:
Postgres
docker run --name postgres_db -e POSTGRES_PASSWORD=mysecretpassword -v "$(pwd)":/var/lib/postgresql/data -p 5433:5432 -d postgres
docker exec -it postgres_db bash
su - postgres
psql
\i /var/lib/postgresql/data/Chinook_PostgreSql.sql
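With the container up, the database is reachable from the host on port 5433. A minimal sketch of building the connection URI (the database name is an assumption — adjust it if the Chinook script creates its own database):

```python
# Connection details for the Dockerized Postgres above.
# Host port 5433 is mapped to the container's 5432.
USER = "postgres"
PASSWORD = "mysecretpassword"  # from the docker run command above
HOST = "localhost"
PORT = 5433
DB = "postgres"                # assumption: change if Chinook lives in its own database

PG_URI = f"postgresql://{USER}:{PASSWORD}@{HOST}:{PORT}/{DB}"
print(PG_URI)
```

This URI can later be handed to SQLAlchemy or Langchain's SQLDatabase.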
Python modules
Besides the usual pip install -r requirements.txt, install the Playwright browser binaries to work with web apps and browsers:
playwright install
Very DE specific Remarks
- Jina-AI: a good scraping framework, with 10k reader tokens
- AgentQL: a brilliant scraping framework, but limited tokens (1,200 a month)
- CSV text based analysis
- Good ol' Langchain SQLDatabaseToolkit
- However, it still hiccups with command syntax carried over from SQLite
- Local models can get crazy when 'overworked', sometimes switching to Groq helps
- Visualization
- PandasAI is a good tool; routed through LiteLLM, you can make as many calls as your setup allows.
- DB table analysis
- Langchain's toolkit can generate catalogs and ERDs in text
- It can also discover the tables on its own
- DB table visualizations
- PandasAI can similarly do the job, but it needs individual table declarations
- PandasAI has good debugging logs that help with understanding the query.
- Observability
- Langsmith, from the Langchain ecosystem, provides a comprehensive UI for understanding how the LLM fared
- It can be extended to business data as well (though that's not a good idea)
- Serving
- Again using the Langchain framework (Langserve)
- Leveraging MCP
- Langchain comes with adapters for easy creation of MCP servers and clients
Very local-LLM specific remarks
- My local setup is Ollama-based; Qwen 2.5 has been a reliable LLM
- Langchain and Groq also provide their own chat platforms to help you work with them
- I have a fine machine (M3, 18 GB) that can run the above decently.
- If the machine can't take this workload, try using Groq (you need to get API keys here too)
llm = init_chat_model("qwen-qwq-32b", model_provider="groq")
- If you need uncensored models (to understand what AI really thinks of us) you can use dolphin versions of the models.
- It does boil down to the model and prompts.
Very Software Engineering specific remarks
- As with all open source models, version hell is a reality.
- Python 3.11 is the best common-ground Python version to work with these frameworks.
Quick Start
Clone the repository
git clone https://github.com/aastha0304/local_aider
cd local_aider
Install dependencies
pip install -r requirements.txt