
Sample Usage: Smolagents


Overview

This project demonstrates the use of smolagents from Hugging Face to create AI agents. The samples were created on Windows and will create files in the C:\tmp directory.
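Before running the samples, make sure the output directory exists. A minimal sketch for creating it, assuming C:\tmp on Windows and falling back to /tmp elsewhere:

```python
import os
from pathlib import Path

# The samples write to C:\tmp on Windows; fall back to /tmp on other systems.
tmp_dir = Path("C:/tmp") if os.name == "nt" else Path("/tmp")
tmp_dir.mkdir(parents=True, exist_ok=True)  # no error if it already exists
print(f"Sample output directory: {tmp_dir}")
```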

Requirements

  • Python >= 3.13
  • Ollama
  • Hugging Face Token

Running the Ollama Server

To start the Ollama server, execute the following command:

ollama serve

The server listens on port 11434 by default.
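To confirm the server is up before running the samples, you can probe that default port. This is a small sketch that only reports reachability; it assumes nothing beyond Ollama's documented default address:

```python
import urllib.request
import urllib.error

def ollama_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            # A running Ollama server answers GET / with HTTP 200.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama reachable:", ollama_running())
```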

Running the Qwen2.5 Model with Ollama

While the Ollama server is running, download the model using the following command:

ollama run qwen2.5

This only needs to be done once.

Note: qwen2.5 is a small model, so the agent may get confused easily. For a more robust setup, use a model with more parameters.

The .env File

Create a .env file with the following content:

HF_TOKEN=<HUGGING_FACE_TOKEN>

Replace <HUGGING_FACE_TOKEN> with your actual Hugging Face token. This is required because all the samples use the Inference API with the following model by default:

Qwen/QwQ-32B

Switch to the Ollama model if needed by changing the active_model to local_model in the model.ModelMgr module.
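A hypothetical sketch of how such a switch might be organized. The names active_model and local_model come from the text above; the config fields and the "ollama_chat/qwen2.5" identifier are assumptions, and the project's actual model.ModelMgr module may differ:

```python
import os

# Hugging Face Inference API model (requires HF_TOKEN from the .env file).
api_model = {
    "model_id": "Qwen/QwQ-32B",
    "token": os.environ.get("HF_TOKEN"),
}

# Local Ollama model (no token required; server must be running).
local_model = {
    "model_id": "ollama_chat/qwen2.5",
    "api_base": "http://localhost:11434",
}

# Change this assignment to local_model to run against Ollama instead.
active_model = api_model

print("Active model:", active_model["model_id"])
```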

Installation

  1. Clone the repository:

    git clone https://github.com/rcw3bb/sample-smolagents.git
    cd sample-smolagents
    
  2. Install the dependencies:

    If Poetry is not yet installed, use the following command to install it:

    python -m pip install poetry
    

    After installation, if you are using Windows, make poetry available on the command line by adding the following directory to the PATH environment variable:

    %LOCALAPPDATA%\Programs\Python\Python313\Scripts
    

    If your system Python version is lower than 3.13, use Poetry to install it:

    poetry python install 3.13
    

    Then install the project dependencies:

    poetry install
    

Non-MCP Samples

Simple File Management with AI Agent

poetry run python -m sample.simple.file_management_sample

Observe whether it uses the provided tools.

Simple File Management with AI Manager Agent

poetry run python -m sample.simple.file_management_managed_sample

Observe whether it uses the provided tools.

MCP Stdio Server Samples

Simple File Management with AI Agent

poetry run python -m sample.mcp.stdio.file_management_sample

Observe whether it uses the provided tools.

Simple File Management with AI Manager Agent

poetry run python -m sample.mcp.stdio.file_management_managed_sample

Observe whether it uses the provided tools.

Using Just the MCP Server

poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_stdio

Where <ROOT_DIR> is the directory that contains the mcp_servers directory.

Use the following prompt to test the server:

Write the text "Hello, World!" to "C:/tmp/mcp-stdio-hello.txt" and show me its content.

Expect to see that the write_file and read_file tools were utilized.
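The server's write_file and read_file tools behave roughly like the following. This is a simplified sketch of two such file tools, not the server's actual implementation; the example writes to a temporary directory rather than C:/tmp for portability:

```python
import tempfile
from pathlib import Path

def write_file(path: str, content: str) -> str:
    """Sketch of a write_file tool: write text to path and confirm."""
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(content, encoding="utf-8")
    return f"Wrote {len(content)} characters to {path}"

def read_file(path: str) -> str:
    """Sketch of a read_file tool: return the file's text content."""
    return Path(path).read_text(encoding="utf-8")

# Mirror the test prompt, using a temporary directory for portability.
target = str(Path(tempfile.mkdtemp()) / "mcp-stdio-hello.txt")
write_file(target, "Hello, World!")
print(read_file(target))
```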

MCP SSE Server Samples

Starting the MCP SSE Server

The server must be started before running any sample in this section.

poetry run python -m mcp_servers.file_manager_server_sse

Simple File Management with AI Agent

poetry run python -m sample.mcp.sse.file_management_sample

Observe whether it uses the provided tools.

Simple File Management with AI Manager Agent

poetry run python -m sample.mcp.sse.file_management_managed_sample

Observe whether it uses the provided tools.

Using Just the MCP Server

  1. Run the server using the following command:

    poetry -C <ROOT_DIR> run python -m mcp_servers.file_manager_server_sse
    

    Where <ROOT_DIR> is the directory that contains the mcp_servers directory. This will run a server on port 8000.

  2. Use the following address to attach to an agent:

    http://localhost:8000/sse
    

Use the following prompt to test the server:

Write the text "Hello, World!" to "C:/tmp/mcp-sse-hello.txt" and show me its content.

Expect to see that the write_file and read_file tools were utilized.

Author

Ronaldo Webb
