
SERV4-OH: Lightweight openHAB MCP Server
SERV4-OH is a lightweight Model Context Protocol (MCP) server for openHAB that focuses on semantic organization and reduced data exchange with Language Models. It provides a streamlined interface between your openHAB smart home system and AI assistants, making it easier and more efficient to control your smart home through natural language.
Note: This project is currently in the early stages of development and is being built in my spare time. However, everything posted on GitHub should be functional. Please report any issues you encounter!
Features
- Semantic Organization: Groups items by semantic meaning (rooms, devices, functions) rather than technical structure
- Reduced Data Exchange: Minimizes the amount of data sent to LLMs by filtering and focusing on relevant information
- Filtering Capabilities: Black- and white-listing of items based on tags and name patterns
- Fully Configurable: Complete configuration through YAML files without code changes
- Dynamic Tool Creation: Automatically generates tools based on your configuration
- Custom Item Support: Handles items with custom value sets.
Installation
Prerequisites
- Python 3.10 or higher
- An openHAB server (tested with openHAB 3.x and 4.x)
- The kimiconfig Python package
Installation Steps
- Clone the repository:

  git clone https://github.com/kimifish/serv4-oh
  cd serv4-oh

- Install the package:

  pip install .

  This will create an executable script serv4-oh in your ~/.local/bin/ directory.

- Configure the systemd service:

  mkdir -p ~/.config/systemd/user/
  cp serv4-oh.service ~/.config/systemd/user/
  systemctl --user daemon-reload

- Start and enable the service:

  systemctl --user enable serv4-oh.service
  systemctl --user start serv4-oh.service

- Check the service status:

  systemctl --user status serv4-oh.service
Configuration
SERV4-OH uses a YAML configuration file located at $XDG_CONFIG_HOME/SERV4-OH/config.yaml.
Create this directory and file if they don't exist:
mkdir -p ~/.config/SERV4-OH
cp ./config.yaml.example ~/.config/SERV4-OH/config.yaml
$EDITOR !:2   # open the copied config (the 2nd argument of the previous command) in your default editor
Basic Configuration
openhab_server:
  host: http://your-openhab-server:8080
  username: your_username  # or .env to use environment variables
  password: your_password  # or .env to use environment variables

# Logging configuration
logging:
  level: "WARNING"  # or DEBUG for more detailed logs
Environment Variables
You can use environment variables for sensitive information by creating a .env file in the ~/.config/SERV4-OH/ directory:
OPENHAB_USERNAME=your_username
OPENHAB_PASSWORD=your_password
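In the config.yaml above, the literal value .env marks a credential that should be taken from the environment instead. A minimal sketch of how such a fallback could be resolved (hypothetical helper; the project's actual loading goes through kimiconfig and may work differently):

```python
import os
from pathlib import Path

def resolve_credential(config_value: str, env_var: str) -> str:
    """Use the value from config.yaml, or fall back to the environment
    when the config contains the literal placeholder '.env'."""
    if config_value != ".env":
        return config_value
    # Load ~/.config/SERV4-OH/.env into the process environment first.
    env_file = Path.home() / ".config" / "SERV4-OH" / ".env"
    if env_file.exists():
        for line in env_file.read_text().splitlines():
            if line.strip() and not line.lstrip().startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
    return os.environ[env_var]

username = resolve_credential(".env", "OPENHAB_USERNAME")  # read from .env / environment
password = resolve_credential(".env", "OPENHAB_PASSWORD")
```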
Item Filtering
You can filter which openHAB items are exposed to the MCP server:
items_filter:
  # Black list - exclude items with these tags or name endings
  black:
    enabled: true
    tags:
      - NoAI
    endings:
      - _Machinery

  # White list - only include items with these tags or name endings
  white:
    enabled: false
    tags:
      - AI
    endings: []
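In plain words: when the black list is enabled, any item carrying one of its tags or matching one of its name endings is excluded; when the white list is enabled, only matching items are kept. A sketch of that filter logic (hypothetical function names, not the actual implementation):

```python
def matches(item: dict, tags: list[str], endings: list[str]) -> bool:
    """True if the item carries one of the tags or its name ends with one of the endings."""
    return bool(set(item.get("tags", [])) & set(tags)) or \
        any(item["name"].endswith(end) for end in endings)

def filter_items(items: list[dict], items_filter: dict) -> list[dict]:
    """Apply the black/white lists from the items_filter config section."""
    black, white = items_filter["black"], items_filter["white"]
    kept = []
    for item in items:
        if black["enabled"] and matches(item, black["tags"], black["endings"]):
            continue  # explicitly excluded
        if white["enabled"] and not matches(item, white["tags"], white["endings"]):
            continue  # white list active and item not on it
        kept.append(item)
    return kept
```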
Semantic Groups
Define semantic groupings to organize your items in a more natural way and reduce LLM-MCP data exchange.
semantic_groups:
  rooms:
    tags:
      - Rooms
    list_tool:
      desc: "Fetches list of all rooms."
      fields_returned: [name, label]
    items_tool:
      desc: "Fetches all items for given room."
      fields_returned: [name, label, type, state]

  terminals:
    tags:
      - Smartphone
      - NetworkAppliance
    list_tool:
      desc: "Fetches list of all PCs, smartphones, etc."
      fields_returned: [name, label]
    items_tool:
      desc: "Fetches all parameters for given terminal."
      fields_returned: [name, label, type, state]
This configuration will create four tools:
- list_rooms - Lists all rooms
- get_rooms_items - Gets all items in a specific room
- list_terminals - Lists all terminals (PCs, smartphones, etc.)
- get_terminals_items - Gets all items for a specific terminal
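Conceptually, the tool generation iterates over semantic_groups and registers a pair of callables per group. The sketch below illustrates the idea; fetch_by_tags and fetch_members stand in for the server's actual openHAB queries and are assumptions, not its real API:

```python
from typing import Callable

def build_tools(semantic_groups: dict,
                fetch_by_tags: Callable[[list[str]], list[dict]],
                fetch_members: Callable[[str], list[dict]]) -> dict[str, Callable]:
    """Generate one list_<group> and one get_<group>_items tool per semantic group."""

    def project(items: list[dict], fields: list[str]) -> list[dict]:
        # Return only the configured fields to reduce the data sent to the LLM.
        return [{f: item.get(f) for f in fields} for item in items]

    tools: dict[str, Callable] = {}
    for group, cfg in semantic_groups.items():
        def list_group(cfg=cfg):
            return project(fetch_by_tags(cfg["tags"]), cfg["list_tool"]["fields_returned"])

        def get_group_items(name: str, cfg=cfg):
            return project(fetch_members(name), cfg["items_tool"]["fields_returned"])

        tools[f"list_{group}"] = list_group
        tools[f"get_{group}_items"] = get_group_items
    return tools
```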
Custom Item Types
Define custom item types with specific behaviors:
special_items:
  lbm_switch:
    type: str
    description: "For switches with auto mode (item has tag 'LBM_Switch' or name ends with '_LBM'). One of [ON, OFF, AUTO]."
    tags:
      - LBM_Switch
    endings:
      - "_LBM"
    possible_values:
      - ON
      - OFF
      - AUTO
Received values are validated against possible_values; if a value is invalid, the LLM gets a meaningful error message.
Note that this defines validation and clarifies the item's capabilities for the LLM, but it does not (of course) implement the item's response to commands. That logic must be implemented within openHAB itself, through rules or by other means.
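The validation step itself is small; a sketch of how an incoming command value could be checked against the special_items definitions (hypothetical names, the real code may differ):

```python
def validate_special_value(item: dict, value: str, special_items: dict) -> str | None:
    """Return an error message for the LLM if the value is not allowed, else None."""
    for spec in special_items.values():
        by_tag = bool(set(item.get("tags", [])) & set(spec.get("tags", [])))
        by_ending = any(item["name"].endswith(e) for e in spec.get("endings", []))
        if by_tag or by_ending:
            allowed = spec["possible_values"]
            if value not in allowed:
                return (f"Invalid value '{value}' for item '{item['name']}'. "
                        f"Allowed values: {allowed}.")
    return None  # not a special item, or the value is valid
```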
Examples
Example 1: Basic Configuration with Rooms and Lights
semantic_groups:
  rooms:
    tags:
      - Rooms
    list_tool:
      desc: "Fetches list of all rooms."
      fields_returned: [name, label]
    items_tool:
      desc: "Fetches all items for given room."
      fields_returned: [name, label, type, state]

  lights:
    tags:
      - Light
    list_tool:
      desc: "Fetches list of all lights."
      fields_returned: [name, label]
    items_tool:
      desc: "Fetches all parameters for given light."
      fields_returned: [name, label, type, state]
Example 2: Using Custom Item Types
special_items:
  autoswitch:
    type: str
    description: "For switches with auto mode (item has tag 'Autoswitch' or name ends with '_Auto'). Value in ['ON', 'OFF', 'AUTO']."
    tags:
      - Autoswitch
    endings:
      - "_Auto"
    possible_values:
      - ON
      - OFF
      - AUTO

  sysd_service:
    type: str
    description: "For systemd services (item has tag Systemd). Value - one of ['START', 'STOP', 'RESTART']."
    tags:
      - Systemd
    endings: []
    possible_values:
      - START
      - STOP
      - RESTART
Example 3: Interacting with the MCP Server
The intended flow after the first user request to the LLM looks like this:
User → LLM
Add some light to living room.
LLM → SERV4-OH
list_rooms(show=true)
SERV4-OH → LLM
[{'name': 'GF_LR', 'label': 'Living room'}, {…} ]
LLM → SERV4-OH
get_rooms_items(name="GF_LR")
SERV4-OH → LLM
[ {'name': 'GF_LR_Light_1', 'type': 'Switch', 'label': 'Main light', 'state': 'OFF'}, {…} ]
LLM → SERV4-OH
send_command( item_name="GF_LR_Light_1", switch_value="ON" )
SERV4-OH → LLM
{'result': 'ok'}
LLM → User
Done.
Subsequent requests will depend on the model's context size and the conversation trimming settings.
How It Works
- On startup, SERV4-OH connects to your openHAB server, retrieves all items, and populates its internal static data.
- It creates MCP tools based on your configuration.
- On an LLM request, it fetches dynamic data from openHAB, filters items and their fields according to your configuration, and returns them.
- On a command request, it validates the value against the configured possible values and sends the command to the openHAB server (see the sketch below).
- On an invalid LLM request, it returns an error with a brief explanation.
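For reference, commands ultimately reach openHAB through its REST API: a POST to /rest/items/{itemName} with the command as a plain-text body. A minimal sketch of that last step, assuming the requests library and optional basic auth (not the project's actual code):

```python
import requests

def send_openhab_command(host: str, item_name: str, value: str,
                         username: str | None = None,
                         password: str | None = None) -> dict:
    """Send a command to an openHAB item and report the result for the LLM."""
    resp = requests.post(
        f"{host}/rest/items/{item_name}",
        data=value,                                   # the command is sent as plain text
        headers={"Content-Type": "text/plain"},
        auth=(username, password) if username else None,
        timeout=10,
    )
    if resp.ok:
        return {"result": "ok"}
    return {"result": "error", "detail": f"openHAB returned HTTP {resp.status_code}"}

# Example: send_openhab_command("http://your-openhab-server:8080", "GF_LR_Light_1", "ON")
```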
Future Plans
The following features are planned for future releases:
- More detailed tags integration
- Adding sources for openHAB static data
- Adding prompt templates
- Persistence Integration
- Voice Integration (TTS)
License
MIT License
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.