
core-go-ollama-llm-functions
A collection of Go-based examples demonstrating how to integrate Large Language Models (LLMs) with function calling capabilities using Ollama.
Overview
This repository showcases various implementations of function calling with LLMs in Go, leveraging Ollama's local model serving capabilities. It includes examples such as:
- Structured JSON outputs
- Tool usage and integration
- Reasoning and inference tasks
- Retrieval-Augmented Generation (RAG) workflows
- SQLite-backed data interactions
- HTTP client-server communication
Each example is designed to illustrate specific aspects of LLM function calling, providing a practical reference for developers.
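As a taste of the structured-output examples, here is a minimal sketch (not the repository's actual code) of requesting JSON-formatted output from a local Ollama server. It assumes Ollama's default address (localhost:11434) and its /api/generate endpoint with the "format": "json" option; the model name and prompt are illustrative.

```go
// Minimal sketch: ask a locally served model for JSON-formatted output
// via Ollama's /api/generate endpoint.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Format string `json:"format"` // "json" asks the model to emit valid JSON
	Stream bool   `json:"stream"`
}

type generateResponse struct {
	Response string `json:"response"`
}

func main() {
	reqBody, _ := json.Marshal(generateRequest{
		Model:  "llama3.1",
		Prompt: "List three Go tools as JSON with fields name and purpose.",
		Format: "json",
		Stream: false,
	})
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out generateResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Response) // raw JSON string produced by the model
}
```

Setting stream to false returns the whole completion in a single response, which keeps the decoding logic trivial.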
Prerequisites
- Go 1.20 or higher
- Ollama installed and running locally
- Required LLM models downloaded via Ollama (e.g., Llama 3.1); a quick way to verify this is sketched below
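Both prerequisites can be checked at once by asking the local Ollama server which models it has pulled. A minimal sketch, assuming the default address localhost:11434 and the /api/tags endpoint:

```go
// Quick check that a local Ollama server is up, and which models
// it has pulled, using the /api/tags endpoint.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

type tagsResponse struct {
	Models []struct {
		Name string `json:"name"`
	} `json:"models"`
}

func main() {
	resp, err := http.Get("http://localhost:11434/api/tags")
	if err != nil {
		fmt.Println("Ollama does not appear to be running:", err)
		return
	}
	defer resp.Body.Close()

	var tags tagsResponse
	if err := json.NewDecoder(resp.Body).Decode(&tags); err != nil {
		fmt.Println("unexpected response:", err)
		return
	}
	for _, m := range tags.Models {
		fmt.Println("available model:", m.Name)
	}
}
```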
Getting Started
- Clone the repository:
  git clone https://github.com/mkaydin/core-go-ollama-llm-functions.git
  cd core-go-ollama-llm-functions
- Install dependencies:
  go mod tidy
- Run an example. Navigate to the desired example directory and execute the Go file. For instance:
  cd structured_output_v1
  go run main.go
Project Structure
The repository is organized into the following directories:
- structured_output_v1/ - Basic structured output example
- structured_output_v2/ - Advanced structured output with nested data
- tools_output/ - Demonstrates tool usage within LLM responses (see the sketch after this list)
- reasoning/ - Examples focusing on reasoning capabilities
- rag_output/ - Retrieval-Augmented Generation implementation
- mcp_sqlite_docker/ - SQLite integration with Docker support
- mcp-curl-client/ - HTTP client example using curl
- mcp-curl-server/ - HTTP server handling LLM interactions
- json_output/ - JSON output formatting examples
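For orientation, the sketch below shows the general shape of a tool-calling request to Ollama's /api/chat endpoint, the mechanism the tools_output example exercises. The get_current_weather tool defined here is hypothetical and not taken from the repository; field names follow Ollama's chat API.

```go
// Illustrative sketch of a tool-calling request against Ollama's /api/chat.
// The get_current_weather tool is a hypothetical example.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	req := map[string]any{
		"model":  "llama3.1",
		"stream": false,
		"messages": []map[string]any{
			{"role": "user", "content": "What is the weather in Toronto?"},
		},
		"tools": []map[string]any{{
			"type": "function",
			"function": map[string]any{
				"name":        "get_current_weather",
				"description": "Get the current weather for a city",
				"parameters": map[string]any{
					"type": "object",
					"properties": map[string]any{
						"location": map[string]any{"type": "string"},
					},
					"required": []string{"location"},
				},
			},
		}},
	}
	body, _ := json.Marshal(req)
	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// When the model decides to call a tool, the reply carries
	// message.tool_calls rather than plain text content.
	var out struct {
		Message struct {
			ToolCalls []struct {
				Function struct {
					Name      string         `json:"name"`
					Arguments map[string]any `json:"arguments"`
				} `json:"function"`
			} `json:"tool_calls"`
		} `json:"message"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	for _, tc := range out.Message.ToolCalls {
		fmt.Printf("model requested %s with args %v\n", tc.Function.Name, tc.Function.Arguments)
	}
}
```

In a full loop, the program would execute the requested function and send its result back to the model as a tool message; this sketch stops at inspecting the requested call.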
Usage
Each example is self-contained. To run a specific example:
- Ensure Ollama is running and the necessary model is available:
  ollama run llama3.1
- Navigate to the example directory:
  cd <example_directory>
- Execute the Go program:
  go run main.go
Replace <example_directory> with the desired example folder name.
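For context on the rag_output example: the first step of a RAG workflow is embedding text so that relevant passages can be retrieved later. Below is a minimal sketch using Ollama's /api/embeddings endpoint; the nomic-embed-text model name is an assumption (any locally pulled embedding-capable model works).

```go
// Sketch of the embedding step behind a RAG workflow, using Ollama's
// /api/embeddings endpoint. The model name is an assumption.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func embed(text string) ([]float64, error) {
	body, _ := json.Marshal(map[string]string{
		"model":  "nomic-embed-text", // assumed to be pulled via Ollama beforehand
		"prompt": text,
	})
	resp, err := http.Post("http://localhost:11434/api/embeddings", "application/json", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var out struct {
		Embedding []float64 `json:"embedding"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return nil, err
	}
	return out.Embedding, nil
}

func main() {
	vec, err := embed("Go integrates well with locally served LLMs.")
	if err != nil {
		panic(err)
	}
	// In a real RAG pipeline these vectors are stored (e.g., in SQLite)
	// and compared by cosine similarity at query time.
	fmt.Printf("embedding length: %d\n", len(vec))
}
```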
Contributing
Contributions are welcome! If you have suggestions or improvements, feel free to open an issue or submit a pull request.
License
This project is licensed under the MIT License. See the LICENSE file for details.
For more information on Ollama and its capabilities, visit the official Ollama GitHub repository.