mkaydin / core-go-ollama-llm-functions
MCP Server (public)

core go ollama llm functions


Repository Info

  • Stars: 0
  • Forks: 0
  • Watchers: 0
  • Issues: 0
  • Language: Go
  • License: MIT License

About This Server

project_putIN_core_llm_functions

This Model Context Protocol (MCP) server can be integrated with AI applications to provide them with additional context and capabilities.

Documentation


core-go-ollama-llm-functions

A collection of Go-based examples demonstrating how to integrate Large Language Models (LLMs) with function calling capabilities using Ollama.

Overview

This repository showcases various implementations of function calling with LLMs in Go, leveraging Ollama's local model serving capabilities. It includes examples such as:

  • Structured JSON outputs
  • Tool usage and integration
  • Reasoning and inference tasks
  • Retrieval-Augmented Generation (RAG) workflows
  • SQLite-backed data interactions
  • HTTP client-server communication

Each example illustrates a specific aspect of LLM function calling, providing a practical reference for developers.
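
For orientation, here is a minimal sketch of the request pattern these examples build on: asking a locally served model for structured JSON through Ollama's HTTP API. It assumes the default endpoint at http://localhost:11434 and the llama3.1 model; the struct and variable names are illustrative, not taken from the repository.

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    // generateRequest carries the subset of Ollama's /api/generate fields
    // used here; Format: "json" asks the model to emit valid JSON only.
    type generateRequest struct {
        Model  string `json:"model"`
        Prompt string `json:"prompt"`
        Format string `json:"format"`
        Stream bool   `json:"stream"`
    }

    // generateResponse picks out the model's text from the reply.
    type generateResponse struct {
        Response string `json:"response"`
    }

    func main() {
        body, _ := json.Marshal(generateRequest{
            Model:  "llama3.1",
            Prompt: "List three Go standard-library packages as a JSON array of strings.",
            Format: "json",
            Stream: false,
        })

        resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        var out generateResponse
        if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
            panic(err)
        }
        fmt.Println(out.Response) // the raw JSON string produced by the model
    }

The same POST-and-decode loop underlies the richer examples; they differ mainly in the prompt, the expected response shape, and what the program does with the decoded output.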

Prerequisites

  • Go 1.20 or higher
  • Ollama installed and running locally
  • Required LLM models downloaded via Ollama (e.g., Llama 3.1)

Getting Started

  1. Clone the repository:

    git clone https://github.com/mkaydin/core-go-ollama-llm-functions.git
    cd core-go-ollama-llm-functions
    
  2. Install dependencies:

    go mod tidy
    
  3. Run an example:

    Navigate to the desired example directory and execute the Go file. For instance:

    cd structured_output_v1
    go run main.go
    

Project Structure

The repository is organized into the following directories:

  • structured_output_v1/ - Basic structured output example
  • structured_output_v2/ - Advanced structured output with nested data
  • tools_output/ - Demonstrates tool usage within LLM responses (see the sketch after this list)
  • reasoning/ - Examples focusing on reasoning capabilities
  • rag_output/ - Retrieval-Augmented Generation implementation
  • mcp_sqlite_docker/ - SQLite integration with Docker support
  • mcp-curl-client/ - HTTP client example using curl
  • mcp-curl-server/ - HTTP server handling LLM interactions
  • json_output/ - JSON output formatting examples
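
The tools_output/ directory centers on Ollama's tool-calling interface. As a rough sketch of that pattern (not the repository's actual code), the program below registers a single hypothetical get_weather tool with the /api/chat endpoint and prints any tool calls the model requests; the tool name, its schema, and the model are all assumptions for illustration.

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    // Minimal subset of Ollama's /api/chat request and response shapes.
    type chatRequest struct {
        Model    string            `json:"model"`
        Messages []chatMessage     `json:"messages"`
        Tools    []json.RawMessage `json:"tools"`
        Stream   bool              `json:"stream"`
    }

    type chatMessage struct {
        Role      string     `json:"role"`
        Content   string     `json:"content"`
        ToolCalls []toolCall `json:"tool_calls,omitempty"`
    }

    type toolCall struct {
        Function struct {
            Name      string          `json:"name"`
            Arguments json.RawMessage `json:"arguments"`
        } `json:"function"`
    }

    type chatResponse struct {
        Message chatMessage `json:"message"`
    }

    func main() {
        // One tool definition in the OpenAI-style schema Ollama accepts;
        // get_weather is a hypothetical tool used only for illustration.
        weatherTool := json.RawMessage(`{
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"]
                }
            }
        }`)

        body, _ := json.Marshal(chatRequest{
            Model:    "llama3.1",
            Messages: []chatMessage{{Role: "user", Content: "What is the weather in Istanbul?"}},
            Tools:    []json.RawMessage{weatherTool},
            Stream:   false,
        })

        resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(body))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        var out chatResponse
        if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
            panic(err)
        }
        // When the model decides a tool is needed, it replies with
        // tool_calls instead of plain text; dispatch on the name here.
        for _, tc := range out.Message.ToolCalls {
            fmt.Printf("model requested %s with args %s\n", tc.Function.Name, tc.Function.Arguments)
        }
    }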

Usage

Each example is self-contained. To run a specific example:

  1. Ensure Ollama is running and the necessary model is available:

    ollama run llama3.1
    
  2. Navigate to the example directory:

    cd <example_directory>
    
  3. Execute the Go program:

    go run main.go
    

Replace <example_directory> with the desired example folder name.
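
Before running an example, it can help to confirm that the Ollama daemon is reachable and the model is installed. The following pre-flight check is an illustrative addition rather than part of the repository; it queries Ollama's /api/tags endpoint, which lists locally available models:

    package main

    import (
        "encoding/json"
        "fmt"
        "net/http"
        "os"
    )

    func main() {
        // GET /api/tags lists locally installed models; a failed request
        // usually means the Ollama daemon is not running.
        resp, err := http.Get("http://localhost:11434/api/tags")
        if err != nil {
            fmt.Fprintln(os.Stderr, "Ollama not reachable:", err)
            os.Exit(1)
        }
        defer resp.Body.Close()

        var tags struct {
            Models []struct {
                Name string `json:"name"`
            } `json:"models"`
        }
        if err := json.NewDecoder(resp.Body).Decode(&tags); err != nil {
            fmt.Fprintln(os.Stderr, "unexpected response:", err)
            os.Exit(1)
        }
        for _, m := range tags.Models {
            fmt.Println("available:", m.Name)
        }
    }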

Contributing

Contributions are welcome! If you have suggestions or improvements, feel free to open an issue or submit a pull request.

License

This project is licensed under the MIT License. See the LICENSE file for details.


For more information on Ollama and its capabilities, visit the official Ollama GitHub repository.


Quick Start

  1. Clone the repository:

    git clone https://github.com/mkaydin/core-go-ollama-llm-functions

  2. Install dependencies:

    cd core-go-ollama-llm-functions
    go mod tidy

  3. Follow the documentation:

    See the repository's README.md (reproduced above) for specific installation and usage instructions.

Repository Details

  • Owner: mkaydin
  • Repo: core-go-ollama-llm-functions
  • Language: Go
  • License: MIT License
  • Last fetched: 8/10/2025
