
# Sample Usage: Smolagents

[TOC]

## Overview

This project demonstrates the use of smolagents from Hugging Face to create AI agents. The samples were created on Windows and write their output files to the `C:\tmp` directory.
## Requirements

- Python >= 3.13
- Ollama
- Hugging Face token
## Running the Ollama Server

To start the Ollama server, execute the following command:

```
ollama serve
```

The server listens on port 11434 by default.
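If you want to confirm the server is actually listening before running the samples, a quick stdlib check against the default port could look like this (the helper name is ours, not part of the project):

```python
import socket

def ollama_is_up(host: str = "localhost", port: int = 11434,
                 timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on the Ollama port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers connection refused and timeouts
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_is_up())
```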
## Running the Qwen2.5 Model with Ollama

While the Ollama server is running, download the model using the following command:

```
ollama run qwen2.5
```

This only needs to be done once.

Note: qwen2.5 is a small model and may get confused easily. For a more robust implementation, use a model with more parameters.
## The .env File

Create a `.env` file with the following content:

```
HF_TOKEN=<HUGGING_FACE_TOKEN>
```

Replace `<HUGGING_FACE_TOKEN>` with your actual Hugging Face token. This is required because all the samples use the Inference API with the following model by default:

```
Qwen/QwQ-32B
```

To switch to the Ollama model instead, change `active_model` to `local_model` in the `model.ModelMgr` module.
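As a rough illustration of what the token loading and model switch involve (the real `model.ModelMgr` internals are not reproduced here, so every name and the structure below are assumptions), a minimal sketch:

```python
from pathlib import Path

# Hypothetical sketch; the project's actual model.ModelMgr may differ.
MODELS = {
    "active_model": {"provider": "hf-inference", "model_id": "Qwen/QwQ-32B"},
    "local_model": {"provider": "ollama", "model_id": "qwen2.5",
                    "api_base": "http://localhost:11434"},
}

def load_env(path: str = ".env") -> dict[str, str]:
    """Minimal .env reader: KEY=VALUE lines; blanks and # comments skipped."""
    env: dict[str, str] = {}
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def get_model_config(name: str = "active_model") -> dict:
    """Pick a model configuration; 'local_model' selects the Ollama entry."""
    return MODELS[name]
```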
## Installation

1. Clone the repository:

   ```
   git clone https://github.com/rcw3bb/sample-smolagents.git
   cd sample-smolagents
   ```

2. Install the dependencies:

   If Poetry is not yet installed, use the following command to install it:

   ```
   python -m pip install poetry
   ```

   After installation, make `poetry` available to the CLI by updating the `PATH` environment variable. On Windows, include the following directory:

   ```
   %LOCALAPPDATA%\Programs\Python\Python313\Scripts
   ```

   If your system Python version is lower than 3.13, use the following command to install it:

   ```
   poetry python install 3.13
   ```

   Then install the dependencies:

   ```
   poetry install
   ```
## Non-MCP Samples

### Simple File Management with AI Agent

```
poetry run python -m sample.simple.file_management_sample
```

Observe whether it uses the provided tools.

### Simple File Management with AI Manager Agent

```
poetry run python -m sample.simple.file_management_managed_sample
```

Observe whether it uses the provided tools.
## MCP Stdio Server Samples

### Simple File Management with AI Agent

```
poetry run python -m sample.mcp.stdio.file_management_sample
```

Observe whether it uses the provided tools.

### Simple File Management with AI Manager Agent

```
poetry run python -m sample.mcp.stdio.file_management_managed_sample
```

Observe whether it uses the provided tools.

### Using Just the MCP Server

```
poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_stdio
```

Where `<ROOT_DIR>` is the directory that contains the `mcp_servers` directory.

Use the following prompt to test the server:

```
Write the text "Hello, World!" to "C:/tmp/mcp-stdio-hello.txt" and show me its content.
```

Expect to see that the `write_file` and `read_file` tools were utilized.
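The prompt above exercises the server's two file tools. As a sketch of the tool logic such a file-manager server might expose (the actual `mcp_servers.file_manager_server_stdio` implementation is not shown here, so this is an assumption, and the MCP protocol plumbing — tool registration, stdio transport — is omitted):

```python
from pathlib import Path

# Hypothetical tool logic; the project's real server may differ.
def write_file(path: str, content: str) -> str:
    """Write text to a file, creating parent directories as needed."""
    target = Path(path)
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(content, encoding="utf-8")
    return f"Wrote {len(content)} characters to {path}"

def read_file(path: str) -> str:
    """Return the text content of a file."""
    return Path(path).read_text(encoding="utf-8")
```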
## MCP SSE Server Samples

### Starting the MCP SSE Server

The server must be running before running any sample from this section.

```
poetry run python -m mcp_servers.file_manager_server_sse
```

### Simple File Management with AI Agent

```
poetry run python -m sample.mcp.sse.file_management_sample
```

Observe whether it uses the provided tools.

### Simple File Management with AI Manager Agent

```
poetry run python -m sample.mcp.sse.file_management_managed_sample
```

Observe whether it uses the provided tools.

### Using Just the MCP Server

1. Run the server using the following command:

   ```
   poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_sse
   ```

   Where `<ROOT_DIR>` is the directory that contains the `mcp_servers` directory. This will run a server on port `8000`.

2. Use the following address to attach to an agent:

   ```
   http://localhost:8000/sse
   ```

3. Use the following prompt to test the server:

   ```
   Write the text "Hello, World!" to "C:/tmp/mcp-sse-hello.txt" and show me its content.
   ```

   Expect to see that the `write_file` and `read_file` tools were utilized.
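The `/sse` endpoint speaks the Server-Sent Events wire format: `event:` and `data:` lines, with records separated by blank lines. A minimal stdlib sketch of that format, for illustration only (real SSE clients handle streaming, retries, and comments, which this does not):

```python
def parse_sse(raw: str) -> list[dict]:
    """Split a raw Server-Sent Events payload into {event, data} records."""
    events = []
    for block in raw.strip().split("\n\n"):
        record = {"event": "message", "data": []}  # "message" is the SSE default
        for line in block.splitlines():
            if line.startswith("event:"):
                record["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                record["data"].append(line[len("data:"):].strip())
        record["data"] = "\n".join(record["data"])
        events.append(record)
    return events
```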
## Author

Ronaldo Webb