eamonoreilly
MCP Server · eamonoreilly · public

azure functions openai aisearch mcp dotnet

A .NET Azure Functions sample using the OpenAI extension and Azure AI Search.

Repository Info

Stars: 0
Forks: 0
Watchers: 0
Issues: 0
Language: Bicep
License: MIT License

About This Server

A .NET Azure Functions sample using the OpenAI extension and Azure AI Search.

Model Context Protocol (MCP) - This server can be integrated with AI applications to provide additional context and capabilities, enabling enhanced AI interactions and functionality.

Documentation

Azure Functions

Use the Azure Functions OpenAI trigger and bindings extension to import data, query it with Azure OpenAI and Azure AI Search, and expose the result as an MCP tool.

This sample contains an Azure Function that uses the OpenAI bindings extension to highlight OpenAI retrieval augmented generation (RAG) with Azure AI Search.

You can learn more about the OpenAI trigger and bindings extension in the GitHub documentation and in the official OpenAI extension documentation.

Information about Model Context Protocol tools in Azure Functions is available in the Azure Functions MCP blog.

Prerequisites

  • Azure Functions Core Tools v4.x
  • Azure OpenAI resource
  • Azure AI Search resource
  • Azurite
  • Azure Developer CLI to create Azure resources automatically (recommended)

Prepare your local environment

Create Azure OpenAI and Azure AI Search resources for local and cloud dev-test

Once you have your Azure subscription, run the following in a new terminal window to create the Azure OpenAI, Azure AI Search, and other resources needed. You are asked whether to enable a virtual network that locks down your OpenAI and AI Search services so they are reachable only from the deployed function app over private endpoints. To skip virtual network integration, select true. If you enable virtual network integration, your local IP address is added to the OpenAI and AI Search services so you can still debug locally.

azd init --template https://github.com/eamonoreilly/azure-functions-openai-aisearch-mcp-dotnet

Make sure to run the following before calling azd to provision resources, so azd can run the scripts required to set up permissions.

Mac/Linux:

chmod +x ./infra/scripts/*.sh 

Windows:

set-executionpolicy remotesigned

Run the following command to provision resources in Azure:

azd provision

If you don't run azd provision, you can instead create an Azure OpenAI resource and an Azure AI Search resource in the Azure portal to get your endpoints. After each resource deploys, click Go to resource and note the Endpoint value. You also need to deploy two models: for example, a deployment named chat using the gpt-35-turbo model and a deployment named embeddings using the text-embedding-3-small model.
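
If you prefer the command line, a minimal sketch using the Azure CLI might look like the following (resource names, regions, SKUs, and model versions are placeholders; adjust them to your subscription and to the models available in your region):

az cognitiveservices account create --name <openai-name> --resource-group <rg> --location <region> --kind OpenAI --sku S0
az cognitiveservices account deployment create --name <openai-name> --resource-group <rg> --deployment-name chat --model-name gpt-35-turbo --model-format OpenAI --model-version <version> --sku-name Standard --sku-capacity 10
az cognitiveservices account deployment create --name <openai-name> --resource-group <rg> --deployment-name embeddings --model-name text-embedding-3-small --model-format OpenAI --model-version <version> --sku-name Standard --sku-capacity 10
az search service create --name <search-name> --resource-group <rg> --location <region> --sku basic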

Create local.settings.json (it should be in the same folder as host.json; it's created automatically if you ran azd provision):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AZURE_OPENAI_ENDPOINT": "<paste from above>",
    "CHAT_MODEL_DEPLOYMENT_NAME": "chat",
    "AZURE_AISEARCH_ENDPOINT": "<paste from above>",
    "EMBEDDING_MODEL_DEPLOYMENT_NAME": "embeddings",
    "SYSTEM_PROMPT": "You must only use the provided documents to answer the question"
  }
}

Permissions

Add your account (your account email, for example contoso@microsoft.com) with the following permissions to the Azure OpenAI and AI Search resources when testing locally; a CLI sketch for assigning these roles follows the list.

If you used azd provision, this step is already done - your logged-in user and your function's managed identity already have these permissions granted.

  • Cognitive Services OpenAI User (OpenAI resource)
  • Azure Search Service Contributor (AI Search resource)
  • Azure Search Index Data Contributor (AI Search resource)
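
As a rough sketch, the same role assignments can be made with the Azure CLI (names are placeholders; if a role name is rejected, use the built-in role names exactly as shown in the Azure portal):

az role assignment create --assignee <your-email> --role "Cognitive Services OpenAI User" --scope $(az cognitiveservices account show --name <openai-name> --resource-group <rg> --query id -o tsv)
az role assignment create --assignee <your-email> --role "Search Service Contributor" --scope $(az search service show --name <search-name> --resource-group <rg> --query id -o tsv)
az role assignment create --assignee <your-email> --role "Search Index Data Contributor" --scope $(az search service show --name <search-name> --resource-group <rg> --query id -o tsv)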

Access to Azure OpenAI and Azure AI Search with virtual network integration

If you selected virtual network integration, access to Azure OpenAI and Azure AI Search is limited to the function app through private endpoints and can't be reached from the internet. To allow testing from your local machine, go to the Networking tab of the Azure OpenAI and Azure AI Search resources and add your client IP address to the allowed list (a CLI sketch follows). If you used azd provision, this step is already done.
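
A rough CLI alternative for allowing your IP (names are placeholders; verify the exact flags with az --help before running):

az cognitiveservices account network-rule add --name <openai-name> --resource-group <rg> --ip-address <your-public-ip>
az search service update --name <search-name> --resource-group <rg> --ip-rules "<your-public-ip>"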

Run your app using Visual Studio Code

1) Press F5 (Run and Debug) to start the app, or open a new terminal window in the ./app folder and run func start.
2) Using your favorite REST client (for example, the REST Client extension in VS Code, Postman, or curl), make a POST request. A test.http file is provided so you can run this quickly.
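
For example, a request from the terminal might look like the following; the route is a placeholder rather than this sample's actual route, so prefer the requests already defined in test.http:

curl -i -X POST http://localhost:7071/api/<your-function-route> -H "Content-Type: application/json" -d "<JSON body expected by the function>"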

Add the Functions endpoint as an MCP Server

Open .vscode/mcp.json and start the local MCP server entry that points to the running function app.
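
If the file in the repository differs, prefer it; as a rough sketch, a local entry typically points VS Code at the function host's MCP SSE endpoint (the server name below is an illustrative placeholder):

{
  "servers": {
    "local-mcp-function": {
      "type": "sse",
      "url": "http://localhost:7071/runtime/webhooks/mcp/sse"
    }
  }
}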

Deploy to Azure

Run this command to provision the function app, with any required Azure resources, and deploy your code:

azd up

You're prompted to supply these required deployment parameters:

  • Environment name: An environment that's used to maintain a unique deployment context for your app. You won't be prompted if you created the local project using azd init.
  • Azure subscription: Subscription in which your resources are created.
  • Azure location: Azure region in which to create the resource group that contains the new Azure resources. Only regions that currently support the Flex Consumption plan are shown.

After publish completes successfully, azd provides you with the URL endpoints of your new functions, but without the function key values required to access the endpoints. To learn how to obtain these same endpoints along with the required function keys, see Invoke the function on Azure in the companion article Quickstart: Create and deploy functions to Azure Functions using the Azure Developer CLI.

Test MCP server in Azure Functions

Open .vscode/mcp.json and start the remote MCP server entry that points to the deployed function app. You're prompted for the function app name and a key. The system key can be obtained from the portal (under App keys) or with the CLI: az functionapp keys list --resource-group <resource_group> --name <function_app_name>
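
As a rough sketch (assuming the MCP extension's default webhook path and that the key is passed as an x-functions-key header; the server name is an illustrative placeholder), a remote entry might look like:

{
  "servers": {
    "remote-mcp-function": {
      "type": "sse",
      "url": "https://<function_app_name>.azurewebsites.net/runtime/webhooks/mcp/sse",
      "headers": {
        "x-functions-key": "<system key>"
      }
    }
  }
}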

Redeploy your code

You can run the azd up command as many times as you need to both provision your Azure resources and deploy code updates to your function app.

NOTE

Deployed code files are always overwritten by the latest deployment package.

Clean up resources

When you're done working with your function app and related resources, use this command to delete the function app and its related resources from Azure and avoid incurring any further costs (--purge skips the soft-delete of the AI resources so your quota is recovered immediately):

azd down --purge

Quick Start

1. Clone the repository

git clone https://github.com/eamonoreilly/azure-functions-openai-aisearch-mcp-dotnet

2. Restore dependencies (this is a .NET project)

cd azure-functions-openai-aisearch-mcp-dotnet/app
dotnet restore

3. Follow the documentation

Check the repository's README.md file for specific installation and usage instructions.

Repository Details

Owner: eamonoreilly
Repo: azure-functions-openai-aisearch-mcp-dotnet
Language: Bicep
License: MIT License
Last fetched: 8/10/2025
