
Gen3 MCP Client

A Python client application that leverages the Model Context Protocol (MCP) to interact with Gen3 data services. This client enables AI-powered analysis of research studies and data using either Anthropic's Claude or Ollama models.

Features

  • Connects to Gen3 data services through MCP
  • Supports multiple LLM backends:
    • Anthropic Claude (claude-3-5-sonnet)
    • Ollama (qwen3:14b)
  • Environment variable configuration
  • Asynchronous operation
  • Flexible MCP server configuration

Prerequisites

  • Python 3.12 or higher
  • uv package manager
  • Access to either Anthropic API or Ollama

Installation

  1. Clone the repository:

     ```shell
     git clone https://github.com/haraprasadj/gen3-mcp-client
     cd gen3-mcp-client
     ```

  2. Install dependencies using uv:

     ```shell
     uv sync
     ```

Configuration

  1. Create a `.env` file in the project root with your credentials:

     ```shell
     ANTHROPIC_API_KEY=your_api_key_here  # If using Claude
     ```

  2. Configure MCP servers in `mcp_servers/mds_mcp.json`:

Running locally

```json
{
    "mcpServers": {
        "gen3": {
            "command": "uv",
            "args": [
                "--directory",
                "/path/to/gen3-mcp-server",
                "run",
                "gen3.py"
            ]
        }
    }
}
```

Using a dockerized MCP server

```json
{
    "mcpServers": {
        "gen3": {
            "command": "docker",
            "args": [
                "run",
                "-i",
                "--rm",
                "gen3-mcp-server"
            ]
        }
    }
}
```
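Both configs follow the same convention: each named entry under `mcpServers` gives a `command` plus `args` that spawn the server process over stdio. A minimal sketch of turning such a config into a launchable command line (`load_server_command` is a hypothetical helper, not part of this client):

```python
import json

def load_server_command(config: dict, name: str) -> list:
    """Build the subprocess command line for one named server
    from an mcpServers-style config dict."""
    entry = config["mcpServers"][name]
    return [entry["command"]] + entry.get("args", [])

# The "Running locally" config from above, inlined here instead of
# being read from mcp_servers/mds_mcp.json:
config = json.loads("""
{
    "mcpServers": {
        "gen3": {
            "command": "uv",
            "args": ["--directory", "/path/to/gen3-mcp-server", "run", "gen3.py"]
        }
    }
}
""")

print(load_server_command(config, "gen3"))
# → ['uv', '--directory', '/path/to/gen3-mcp-server', 'run', 'gen3.py']
```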

Usage

Run the client:

```shell
uv run client.py
```

The client will:

  1. Connect to the configured Gen3 MCP server
  2. Initialize the specified LLM (Claude or Ollama)
  3. Process queries about research studies and data
  4. Output analysis results
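The run loop above can be sketched with asyncio; `connect_to_server` and `run_query` are hypothetical stand-ins for the real MCP session and LLM calls in `client.py`:

```python
import asyncio

# Hypothetical stand-ins for the real MCP/LLM plumbing in client.py.
async def connect_to_server() -> str:
    return "gen3"  # name of the configured MCP server

async def run_query(server: str, query: str) -> str:
    return f"[{server}] analysis of: {query}"

async def main() -> list:
    server = await connect_to_server()  # step 1: connect
    queries = [                         # step 3: process queries
        "List available studies",
        "Summarize subject counts",
    ]
    # Asynchronous operation: independent queries are awaited concurrently.
    return await asyncio.gather(*(run_query(server, q) for q in queries))

for line in asyncio.run(main()):        # step 4: output results
    print(line)
```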

Development

To switch between LLM backends, modify the LLM initialization in `client.py` (the chat model classes come from the `langchain_anthropic` and `langchain_ollama` packages):

For Claude:

```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")
```

For Ollama:

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="qwen3:14b")
```
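One way to avoid hand-editing the initialization is to key the choice off available credentials; `pick_backend` below is a hypothetical helper, not part of the shipped `client.py`:

```python
def pick_backend(env: dict) -> str:
    """Hypothetical helper: name the chat-model class to use based on
    whether Anthropic credentials are present in the environment."""
    return "ChatAnthropic" if env.get("ANTHROPIC_API_KEY") else "ChatOllama"

print(pick_backend({"ANTHROPIC_API_KEY": "sk-..."}))  # → ChatAnthropic
print(pick_backend({}))                               # → ChatOllama
```

In the real client, the returned name would select which class to instantiate (e.g. via a small dispatch dict), keeping backend choice in the `.env` file alongside the API key.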

License

[Add your license information here]


Repository Details

  • Owner: haraprasadj
  • Repo: gen3-mcp-client
  • Language: Python
  • License: -
  • Last fetched: 8/10/2025
