
Connecting to Microsoft Foundry

An Azure AI connection enables Flow actions to interact with Microsoft Foundry resources, such as Large Language Models (LLMs).

Properties

  • Name: Name of the connection.
  • API Key: The API key used for authentication.
  • Endpoint: The endpoint URL for the Microsoft Foundry resource.

Creating a new connection

To add an Azure AI action, select an existing Azure AI connection or create a new one.
We'll walk through creating a new connection to a model deployed in Microsoft Foundry.

  1. In the Flowchart, click to select the action that you want to create a connection for.
  2. Select Connection in the property panel.
  3. Toggle Create New Connection on.
  4. Fill in the required fields:
    • Name: Enter a unique name for this connection. Choose a name that makes it easy to understand what the connection is for.
    • API Key: Provide the API key associated with the deployed model.
    • Endpoint: Enter the full URL of the deployed model (it should look something like this: https://MY-PROJECT.openai.azure.com/openai/v1).

To find the API Key and Endpoint, go to the Microsoft Foundry portal and do the following:

  1. In the application top bar, go to Build (upper right corner)
  2. Select Models from the left menu
  3. Select the deployed model
  4. In the Playground tab, switch from Chat to Code view, and select OpenAI SDK from the SDKs dropdown.
  5. Copy the Endpoint URL
  6. In the Details tab, copy the Key (which is the API key)
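
Once you have copied the Endpoint and Key, you can sanity-check them outside of Flow before saving the connection. The following is a minimal sketch, assuming the openai Python package is installed; the endpoint, API key, and deployment name are placeholders you should replace with your own values:

```python
# Quick check that the Endpoint and API Key copied from the Foundry portal work.
# Assumes: pip install openai. All values below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://MY-PROJECT.openai.azure.com/openai/v1",  # Endpoint from the Code view
    api_key="YOUR_API_KEY",                                     # Key from the Details tab
)

# Send a trivial chat request; a normal reply confirms the endpoint and key are valid.
response = client.chat.completions.create(
    model="MY-DEPLOYMENT",  # name of the deployed model
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```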



Flow 1.11 (December 2025) and earlier

The following documentation applies to Flow 1.11 (December 2025) and earlier.

Creating a new connection

To add an Azure AI action, select an existing Azure AI connection or create a new one.

Important

You need to create the connection differently depending on whether you want to use an OpenAI model or a Foundry model. See details below.


Create a connection to an Azure Foundry model

If you want to use a Foundry model, you can reuse the same connection across multiple model deployments.

  1. In the Flowchart, click to select the action that you want to create a connection for.
  2. Select Connection in the property panel.
  3. Toggle Create New Connection on.
  4. Fill in the required fields:
    • Name: Enter a unique name for this connection. Choose a name that makes it easy to understand what the connection is for.
    • API Key: Provide the API key associated with the deployed model.
    • Endpoint: Enter the full URL of the deployed model (e.g., https://xx-m8on1111-eastus2.services.ai.azure.com/models).

To find the API Key and Endpoint, go to the Microsoft Foundry portal and do the following:

  1. Click Models + Endpoints
  2. Select the deployed model
  3. In the SDK dropdown, select Azure AI Inference SDK.
  4. Copy the Endpoint URL
  5. Copy the (API) Key
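
If you want to confirm these values before creating the connection, you can test them with a short script. This is a minimal sketch, assuming the azure-ai-inference Python package; the endpoint, key, and model name are placeholders:

```python
# Quick check of a Foundry model Endpoint and Key using the Azure AI Inference SDK.
# Assumes: pip install azure-ai-inference. All values below are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://xx-m8on1111-eastus2.services.ai.azure.com/models",  # Endpoint URL
    credential=AzureKeyCredential("YOUR_API_KEY"),                        # (API) Key
)

# The /models endpoint is not tied to one deployment, so the model
# (deployment) name is passed with each request.
response = client.complete(
    model="MY-DEPLOYMENT",
    messages=[UserMessage(content="Say hello.")],
)
print(response.choices[0].message.content)
```

Because the deployment name is supplied per request rather than in the Endpoint, this is why the same connection can be reused across multiple model deployments.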



Create a connection to an Azure OpenAI model

If you want to use an OpenAI model, you must create one connection per model deployment, because the deployment name is part of the Endpoint.

  1. In the Flowchart, click to select the action that you want to create a connection for.
  2. Select Connection in the property panel.
  3. Toggle Create New Connection on.
  4. Fill in the required fields:
    • Name: Enter a unique name for this connection. Choose a name that makes it easy to understand what the connection is for.
    • API Key: Provide the API key associated with the deployed model.
    • Endpoint: Enter the full URL of the deployed model (e.g., https://xx-m8on1111-eastus2.cognitiveservices.azure.com/openai/deployments/gpt-4o-mini). Note that the Endpoint contains the deployment name (gpt-4o-mini).

To find the API Key and Endpoint, go to the Microsoft Foundry portal and do the following:

  1. Click Models + Endpoints
  2. Select the deployed model
  3. In the SDK dropdown, select Azure AI Inference SDK.
  4. Copy the Endpoint URL. Note that the deployment name is in the URL.
  5. Copy the (API) Key
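
To see why each deployment needs its own connection, note how the deployment name at the end of the Endpoint URL maps to the model used for the call. The sketch below is a minimal check, assuming the openai Python package's AzureOpenAI client; the resource URL, key, API version, and deployment name are placeholders:

```python
# Quick check of an Azure OpenAI deployment. Assumes: pip install openai.
# All values below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    # Resource part of the Endpoint (everything before /openai/deployments/...)
    azure_endpoint="https://xx-m8on1111-eastus2.cognitiveservices.azure.com",
    api_key="YOUR_API_KEY",
    api_version="2024-10-21",  # assumed API version; use one supported by your resource
)

# The deployment name (gpt-4o-mini in the example Endpoint) is what the
# connection is bound to -- which is why each deployment needs its own connection.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```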
