Microsoft Foundry quickstart

This article refers to the Microsoft Foundry (new) portal.
In this quickstart, you use Microsoft Foundry to chat with a Foundry model, and then create and chat with an agent.

Prerequisites

Set environment variables and get the code

Store your project endpoint and the following values as environment variables, or save them in a .env file in your working directory so the Python scripts below can load them with python-dotenv.
PROJECT_ENDPOINT=<endpoint copied from welcome screen>
AGENT_NAME="MyAgent"
MODEL_DEPLOYMENT_NAME="gpt-4.1-mini"
Follow along below or get the code.

Install and authenticate

Make sure you install the prerelease (preview) version of the packages, as shown here. A quick way to verify the installed version follows these steps.
  1. Install these packages, including the preview version of azure-ai-projects. This version uses the Foundry projects (new) API (preview).
    pip install azure-ai-projects --pre
    pip install openai azure-identity python-dotenv
    
  2. Sign in using the CLI az login command to authenticate before running your Python scripts.
Code uses Azure AI Projects 2.x (preview) and is incompatible with Azure AI Projects 1.x. Switch to Foundry (classic) documentation for the Azure AI Projects 1.x GA version.
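
To confirm you got the 2.x preview rather than a 1.x release, you can check the installed version from Python. This is a minimal check using only the standard library; the exact version string you see depends on the current preview release.

from importlib.metadata import version

# The scripts in this quickstart require the 2.x preview of azure-ai-projects
installed = version("azure-ai-projects")
print(f"azure-ai-projects {installed}")
assert installed.startswith("2."), "Install the preview with: pip install azure-ai-projects --pre"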

Chat with a model

Interacting with a model is the basic building block of AI applications. Send an input and receive a response from the model:
import os
from dotenv import load_dotenv
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

load_dotenv()

print(f"Using PROJECT_ENDPOINT: {os.environ['PROJECT_ENDPOINT']}")
print(f"Using MODEL_DEPLOYMENT_NAME: {os.environ['MODEL_DEPLOYMENT_NAME']}")

# Authenticate with Microsoft Entra ID and connect to the Foundry project
project_client = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

# Get an OpenAI-compatible client scoped to the project
openai_client = project_client.get_openai_client()

# Send a prompt to the deployed model and print the reply
response = openai_client.responses.create(
    model=os.environ["MODEL_DEPLOYMENT_NAME"],
    input="What is the size of France in square miles?",
)
print(f"Response output: {response.output_text}")

The same example in C#:

using Azure.AI.Projects;
using Azure.AI.Projects.OpenAI;
using Azure.Identity;
using OpenAI.Responses;

#pragma warning disable OPENAI001

string projectEndpoint = Environment.GetEnvironmentVariable("PROJECT_ENDPOINT")
?? throw new InvalidOperationException("Missing environment variable 'PROJECT_ENDPOINT'");
string modelDeploymentName = Environment.GetEnvironmentVariable("MODEL_DEPLOYMENT_NAME")
?? throw new InvalidOperationException("Missing environment variable 'MODEL_DEPLOYMENT_NAME'");

AIProjectClient projectClient = new(new Uri(projectEndpoint), new AzureCliCredential());

ProjectResponsesClient responseClient = projectClient.OpenAI.GetProjectResponsesClientForModel(modelDeploymentName);
ResponseResult response = await responseClient.CreateResponseAsync("What is the size of France in square miles?");

Console.WriteLine(response.GetOutputText());
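
If you want to print the reply as it's generated instead of waiting for the full response, the Responses API also supports streaming. Here's a sketch in Python that assumes the Foundry project endpoint supports streamed responses; the event handling follows the standard openai Python package.

# Stream the model's output as it's generated (assumes the endpoint allows streaming)
stream = openai_client.responses.create(
    model=os.environ["MODEL_DEPLOYMENT_NAME"],
    input="What is the size of France in square miles?",
    stream=True,
)
for event in stream:
    # Text deltas arrive as "response.output_text.delta" events
    if event.type == "response.output_text.delta":
        print(event.delta, end="", flush=True)
print()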

Create an agent

Create an agent using your deployed model. The agent definition captures its core behavior (the model and instructions), so you get consistent responses without repeating the instructions on each request. You can update or delete an agent at any time.
import os
from dotenv import load_dotenv
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import PromptAgentDefinition

load_dotenv()

project_client = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

agent = project_client.agents.create_version(
    agent_name=os.environ["AGENT_NAME"],
    definition=PromptAgentDefinition(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        instructions="You are a helpful assistant that answers general questions",
    ),
)
print(f"Agent created (id: {agent.id}, name: {agent.name}, version: {agent.version})")
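
Because the call above returned a version number, one way to update the agent is to publish a new version under the same name. A minimal sketch, assuming that calling create_version again with an existing agent name adds a new version rather than failing:

# Assumption: re-creating a version under the same agent name updates the agent
agent = project_client.agents.create_version(
    agent_name=os.environ["AGENT_NAME"],
    definition=PromptAgentDefinition(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        instructions="You are a concise assistant that answers in one or two sentences",
    ),
)
print(f"Updated agent (id: {agent.id}, version: {agent.version})")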

Chat with an agent

Use the agent you created earlier (named "MyAgent" through the AGENT_NAME variable) to ask a question and then a related follow-up. The conversation keeps the history across both requests, so the follow-up can refer back to the earlier answer.
import os
from dotenv import load_dotenv
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

load_dotenv()

project_client = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

agent_name = os.environ["AGENT_NAME"]
openai_client = project_client.get_openai_client()

# Optional Step: Create a conversation to use with the agent
conversation = openai_client.conversations.create()
print(f"Created conversation (id: {conversation.id})")

# Chat with the agent to answer questions
response = openai_client.responses.create(
    conversation=conversation.id,  # Optional conversation context for multi-turn
    extra_body={"agent": {"name": agent_name, "type": "agent_reference"}},
    input="What is the size of France in square miles?",
)
print(f"Response output: {response.output_text}")

# Optional Step: Ask a follow-up question in the same conversation
response = openai_client.responses.create(
    conversation=conversation.id,
    extra_body={"agent": {"name": agent_name, "type": "agent_reference"}},
    input="And what is the capital city?",
)
print(f"Response output: {response.output_text}")
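
Because the two requests shared a conversation, the conversation now holds the full history. To inspect it, you can list its items. This is a sketch that assumes the installed openai package exposes the Conversations API items listing as conversations.items.list; adjust if your SDK version differs.

# Assumption: conversations.items.list follows the public OpenAI Conversations API surface
for item in openai_client.conversations.items.list(conversation.id):
    # Message items carry a role (user or assistant); other item types may not
    print(f"{item.type}: {getattr(item, 'role', '')}")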

Clean up resources

If you no longer need any of the resources you created, delete the resource group associated with your project.
  • In the Azure portal, select the resource group, and then select Delete. Confirm that you want to delete the resource group.
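
If you prefer the command line, you can delete the resource group with the Azure CLI instead; substitute your own resource group name, and note that this removes everything inside the group.

az group delete --name <your-resource-group-name>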

Next step

Idea to prototype - Build and evaluate an enterprise agent