
Calculator MCP Server
A simple MCP server exposing basic arithmetic (add, subtract, multiply, divide), mathematical expression evaluation, statistics (mean, median, mode, standard deviation, variance), and calculus (numerical integration and differentiation) operations as ADK FunctionTools over stdio and SSE.
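The statistical and calculus tools map to straightforward numeric routines. The sketch below is an illustrative approximation (the server's actual implementations and tool names may differ): a mean via the standard library and a derivative via central differences.

```python
import statistics

def mean(values: list[float]) -> float:
    """Arithmetic mean, as a `mean` tool would compute it."""
    return statistics.mean(values)

def differentiate(f, x: float, h: float = 1e-6) -> float:
    """Numerical derivative of f at x via the central-difference formula."""
    return (f(x + h) - f(x - h)) / (2 * h)

print(mean([2, 4, 6]))                                # → 4
print(round(differentiate(lambda v: v**2, 3.0), 4))   # → 6.0
```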
Prerequisites
- Python 3.13+
- uv
Installation
1. Clone or download this repository.
2. Create and activate a virtual environment:
   ```
   uv venv
   source .venv/bin/activate
   ```
3. Install dependencies:
   ```
   uv sync
   ```
Running the Server
Start the MCP server:

via STDIO:
```
uv run stdio_server.py
```
via SSE:
```
uv run sse_server.py
```
You should see:
```
Launching Calculator MCP Server...
```
Running the client
1. Open a new terminal tab and activate the environment:
   ```
   source .venv/bin/activate
   cd path/to/your/calculator_agent_litellm
   ```
2. Create a `.env` file, add your API key, and adjust the model appropriately:
   - e.g. `GEMINI_API_KEY` or `MY_OPENAI_API_KEY`, with `GOOGLE_GENAI_USE_VERTEXAI=false`
   - For all available models, refer to https://docs.litellm.ai/docs/providers/gemini
3. Run the server:
   ```
   cd path/to/your/calculator_agent_litellm
   adk web ../
   ```
Using from an ADK Agent
Below is an example Python client that connects to this MCP server and invokes calculator operations:
```python
import asyncio
import json

from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset, StdioServerParameters
from google.adk.agent.tool_runner import Runner

async def main():
    # Launch and connect to the server over stdio
    tools, stack = await MCPToolset.from_server(
        connection_params=StdioServerParameters(
            command="python3",
            args=["/absolute/path/to/stdio_server.py"],
        )
    )

    # Use the Runner to call a tool
    runner = Runner(tools)
    result = await runner.call_tool("add", {"a": 5, "b": 3})
    print(json.loads(result.text)["result"])  # prints 8

    # Close the connection
    await stack.aclose()

if __name__ == "__main__":
    asyncio.run(main())
```
Replace /absolute/path/to/stdio_server.py with the correct path on your system.
Listing Available Tools
You can also list available tools using any MCP-capable client. In Python:
```python
tools, stack = await MCPToolset.from_server(
    connection_params=StdioServerParameters(
        command="python3",
        args=["/absolute/path/to/stdio_server.py"],
    )
)
print([t.name for t in tools])  # e.g. ["add", "subtract", "multiply", "divide", ...]
await stack.aclose()
```
Cline MCP Configuration
If using Cline MCP, add the following configuration:
```json
{
  "mcpServers": {
    "calculator": {
      "type": "stdio",
      "command": "uv",
      "args": [
        "--directory",
        "<absolute_path>/calculator_mcp/",
        "run",
        "stdio_server.py"
      ]
    }
  }
}
```
Using calculator_agent with Gemini (via adk web SSE Server)
This section explains how to connect the calculator_agent (configured for Gemini) to the SSE-based MCP server using adk web.
1. Create a new folder for your ADK agent, e.g. `calculator_agent`. (This folder should already exist if you cloned the repository.)
2. Ensure `calculator_agent/agent.py` is configured for Gemini (e.g. `model="gemini-2.0-flash"`).
3. Create or verify the `.env` file inside the `calculator_agent` folder with the following content for Gemini:
   ```
   GOOGLE_GENAI_USE_VERTEXAI=FALSE
   GOOGLE_API_KEY=PASTE_YOUR_ACTUAL_API_KEY_HERE
   ```
   Note: Replace `PASTE_YOUR_ACTUAL_API_KEY_HERE` with your actual Google API key.
4. Start the SSE server from the `calculator_mcp` root directory:
   ```
   uv run sse_server.py
   ```
5. In another terminal, navigate to your `calculator_agent` folder and run `adk web` pointing to its parent directory:
   ```
   cd path/to/your/calculator_agent
   adk web ../
   ```
   For example, if `calculator_agent` is inside `calculator_mcp` and you are in `calculator_mcp/calculator_agent`, you would run `adk web ../`. If `calculator_agent` is a sibling of `calculator_mcp`, you might run `adk web ../calculator_agent` from within `calculator_mcp`, or adjust paths accordingly. The key is that `adk web` needs to find the agent's directory.
Your browser will open the ADK web UI, allowing you to interact with the calculator tools using the Gemini-powered calculator_agent.
Using calculator_agent_litellm with Other LLMs (e.g., OpenAI)
This section describes how to run the calculator_agent_litellm example agent using non-Gemini models like OpenAI, leveraging LiteLLM.
Configuring calculator_agent_litellm for OpenAI
The calculator_agent_litellm uses LiteLLM, which simplifies using various LLM providers, including OpenAI.
1. Set environment variables for OpenAI: create or update a `.env` file in the root of your `calculator_agent_litellm` project folder. For OpenAI, you primarily need to set:
   ```
   # Required: Your OpenAI API key
   OPENAI_API_KEY="your_openai_api_key_here"

   # Optional: LiteLLM allows specifying the model directly in the agent code
   # (as done in the example agent.py: model="openai/gpt-4.1-nano")
   # or via other environment variables if your agent code is set up to read them.
   # If agent.py uses a generic "MODEL" env var for the LiteLLM model string:
   # MODEL="openai/gpt-4"
   ```
   Replace `your_openai_api_key_here` with your actual OpenAI API key. LiteLLM will automatically use it for OpenAI calls if the model string in `agent.py` (e.g. `openai/gpt-4.1-nano`) indicates an OpenAI model.
2. Ensure the agent code specifies an OpenAI model: verify that `agent.py` within `calculator_agent_litellm` uses an OpenAI model string recognized by LiteLLM (e.g. `"openai/o4-mini"`). The example uses `model="openai/gpt-4.1-nano"`.
   ```python
   # In calculator_agent_litellm/agent.py
   # root_agent = LlmAgent(
   #     model=LiteLlm(
   #         model="openai/gpt-4.1-nano",  # This specifies an OpenAI model via LiteLLM
   #         api_key=os.getenv("MY_OPENAI_API_KEY"),  # Ensure this matches your .env key if different
   #     ),
   #     ...
   # )
   ```
   Note: the example `agent.py` for `calculator_agent_litellm` uses `os.getenv("MY_OPENAI_API_KEY")`. Ensure your `.env` file for `calculator_agent_litellm` uses `MY_OPENAI_API_KEY`, or update `agent.py` to use `OPENAI_API_KEY`. For consistency with common practice, using `OPENAI_API_KEY` in both the `.env` file and `os.getenv()` is recommended.
3. Start the SSE server (if not already running) from the `calculator_mcp` project root:
   ```
   uv run sse_server.py
   ```
4. Run the ADK web UI for `calculator_agent_litellm`: in another terminal, navigate to your `calculator_agent_litellm` folder and run `adk web` pointing to its parent directory (or the agent directory itself, as appropriate):
   ```
   cd path/to/your/calculator_agent_litellm
   adk web ../
   ```
   The agent will use LiteLLM to proxy requests to the specified OpenAI model using your API key.
Running Tests
This project uses pytest for testing.
1. From the project root directory (`calculator_mcp`), run:
   ```
   uv run pytest tests
   ```
   This will discover and run all tests in `tests/test_calculator.py`.
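If you want to extend the suite, a test in `tests/test_calculator.py` might look like the sketch below. The `add` function here is a local stand-in for the server's tool (the real import path is an assumption — adjust it to wherever the project defines its operations):

```python
import statistics

def add(a: float, b: float) -> float:
    # Stand-in for the server's `add` tool; import the real one in practice.
    return a + b

def test_add():
    assert add(5, 3) == 8

def test_mean():
    assert statistics.mean([2, 4, 6]) == 4
```

Run it with `uv run pytest tests` as above; pytest discovers any `test_*` function automatically.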
Demo Videos
Here are three demonstration videos showing different ways to use the Calculator MCP Server:
1. Basic Addition Operation
https://github.com/user-attachments/assets/e65a9074-2bed-4647-af80-97d8c193e90b
This video demonstrates the basic addition operation using the MCP server, showing how to add two numbers together.
2. Statistical Mean Calculation
https://github.com/user-attachments/assets/0a9d7410-b5bf-465b-8d61-7dab1d5f1701
Watch how to calculate the statistical mean of a dataset using the calculator's statistical functions.
3. Mathematical Expression Evaluation
https://github.com/user-attachments/assets/6939b92a-0d66-4205-aea7-291114a258f3
Learn how to evaluate complex mathematical expressions using the calculator's expression evaluation capabilities.
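Expression evaluation of this kind can be done safely without `eval`. The sketch below is an illustrative approximation, not the server's actual implementation: it walks Python's AST and permits only whitelisted arithmetic nodes.

```python
import ast
import operator

# Map AST operator nodes to arithmetic functions; anything else is rejected.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def safe_eval(expr: str) -> float:
    """Evaluate an arithmetic expression using only whitelisted AST nodes."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"Unsupported expression: {ast.dump(node)}")
    return walk(ast.parse(expr, mode="eval").body)

print(safe_eval("2 * (3 + 4) - 5"))  # → 9
```

Unlike `eval`, this rejects names, calls, and attribute access outright, so input like `__import__('os')` raises `ValueError` instead of executing.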
References
- MCP SSE example: https://github.com/google/adk-python/blob/main/contributing/samples/mcp_sse_agent/filesystem_server.py
- MCP STDIO example: https://github.com/google/adk-python/blob/main/contributing/samples/mcp_stdio_server_agent/agent.py
- Setting up runtime keys: https://docs.litellm.ai/docs/set_keys
Quick Start
1. Clone the repository:
   ```
   git clone https://github.com/kimyu-ng/calculator_mcp
   ```
2. Install dependencies:
   ```
   cd calculator_mcp
   uv sync
   ```
3. Follow the documentation: check the repository's README.md for specific installation and usage instructions.