mcp-test

MCP Server by anuragg-saxenaa (public)

A Spring Boot application that generates weather information via the Ollama API, demonstrating the integration of a local AI model with a REST API.

Repository Info

  • Stars: 0
  • Forks: 0
  • Watchers: 0
  • Issues: 0
  • Language: TypeScript
  • License: -

About This Server

A Spring Boot application that generates weather information via the Ollama API, demonstrating the integration of a local AI model with a REST API.

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

Weather API with Ollama Integration

A Spring Boot application that provides weather information using the Ollama API for generating responses. This application demonstrates how to integrate a local AI model (Mistral) with a Spring Boot REST API.

Prerequisites

  • Java 23
  • Maven
  • Ollama installed and running locally
  • Mistral model installed in Ollama

Getting Started

1. Install Ollama

Follow the instructions at Ollama's official website to install Ollama on your system.

2. Install the Mistral Model

Run the following command to download the Mistral model:

ollama pull mistral

3. Build the Application

mvn clean install

4. Run the Application

mvn spring-boot:run

The application will start on port 8080.

API Endpoints

1. Current Weather

GET /api/weather/current/{location}

Example: http://localhost:8080/api/weather/current/New York

2. Weather Forecast

GET /api/weather/forecast/{location}?days={days}

Example: http://localhost:8080/api/weather/forecast/London?days=3
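Note that a location such as "New York" contains a space and must be percent-encoded when placed in the URL path. The following is a hypothetical client-side helper, not code from the repository, sketching how the two endpoint URLs could be assembled:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Hypothetical helper for building the two endpoint URLs; the class and
// method names are assumptions for illustration only.
public class WeatherClientSketch {

    static String encodeSegment(String location) {
        // URLEncoder targets form encoding, so swap '+' for the path-safe "%20".
        return URLEncoder.encode(location, StandardCharsets.UTF_8).replace("+", "%20");
    }

    static String currentWeatherUrl(String baseUrl, String location) {
        return baseUrl + "/api/weather/current/" + encodeSegment(location);
    }

    static String forecastUrl(String baseUrl, String location, int days) {
        return baseUrl + "/api/weather/forecast/" + encodeSegment(location) + "?days=" + days;
    }

    public static void main(String[] args) {
        // Prints: http://localhost:8080/api/weather/current/New%20York
        System.out.println(currentWeatherUrl("http://localhost:8080", "New York"));
        // Prints: http://localhost:8080/api/weather/forecast/London?days=3
        System.out.println(forecastUrl("http://localhost:8080", "London", 3));
    }
}
```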

Project Structure

  • com.arrayindex.weatherlangchain - Main package
    • config - Configuration classes
      • OllamaConfig - Configuration for Ollama API client
    • controller - REST controllers
      • WeatherController - Handles weather API requests
    • model - Data models
      • WeatherResponse - Response model for weather information
    • service - Business logic
      • WeatherService - Service for generating weather information using Ollama
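As a rough illustration of the model layer above, the WeatherResponse class might be shaped like the record below. This is a minimal sketch; the repository's actual field names are not shown here, so these are assumptions.

```java
// Hypothetical shape of the WeatherResponse model; the field names
// "location" and "report" are assumptions, not the repository's code.
public record WeatherResponse(String location, String report) {

    public static void main(String[] args) {
        WeatherResponse response = new WeatherResponse("London", "Overcast, light rain, 14°C");
        System.out.println(response.location() + ": " + response.report());
    }
}
```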

Configuration

The application is configured in application.properties:

server.port=8080
ollama.model=mistral

How It Works

  1. The application receives a request for weather information
  2. The request is processed by the WeatherController
  3. The WeatherService generates a prompt for the Ollama API
  4. The Ollama API (using the Mistral model) generates a response
  5. The response is returned to the client
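Steps 3 and 4 above can be sketched as follows. The /api/generate endpoint and its "model", "prompt", and "stream" fields follow Ollama's documented HTTP API; the prompt wording and method names are assumptions, not the repository's actual WeatherService code.

```java
// Sketch of how a service could assemble the JSON body that gets POSTed to
// Ollama's /api/generate endpoint on http://localhost:11434.
public class OllamaRequestSketch {

    // Hypothetical prompt; the real application's wording is not shown here.
    static String buildPrompt(String location) {
        return "Give a short, plausible current weather report for " + location + ".";
    }

    static String buildRequestBody(String model, String location) {
        // stream=false asks Ollama for one complete response instead of chunks.
        return "{\"model\":\"" + model + "\","
             + "\"prompt\":\"" + buildPrompt(location) + "\","
             + "\"stream\":false}";
    }

    public static void main(String[] args) {
        System.out.println(buildRequestBody("mistral", "London"));
    }
}
```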

Postman Collection

A Postman collection is provided in the Weather-API.postman_collection.json file. Import this file into Postman to easily test the API endpoints.

Notes

  • This application uses the Ollama API to generate weather information, not real weather data
  • The responses are generated by the AI model based on the prompts
  • The application requires Ollama to be running locally on port 11434

Quick Start

1. Clone the repository

git clone https://github.com/anuragg-saxenaa/mcp-test

2. Build the project

cd mcp-test
mvn clean install

3. Follow the documentation

Check the repository's README.md file for specific installation and usage instructions.

Repository Details

  • Owner: anuragg-saxenaa
  • Repo: mcp-test
  • Language: TypeScript
  • License: -
  • Last fetched: 8/10/2025

Recommended MCP Servers

  • Discord MCP - Enable AI assistants to seamlessly interact with Discord servers, channels, and messages. (integrations, discord, chat)
  • Knit MCP - Connect AI agents to 200+ SaaS applications and automate workflows. (integrations, automation, saas)
  • Apify MCP Server - Deploy and interact with Apify actors for web scraping and data extraction. (apify, crawler, data)
  • BrowserStack MCP - BrowserStack MCP Server for automated testing across multiple browsers. (testing, qa, browsers)
  • Zapier MCP - A Zapier server that provides automation capabilities for various apps. (zapier, automation)