Introduction

The Azure OpenAI integration lets you connect Rootly to AI models running inside your organization’s Azure subscription. Rather than using Rootly’s default OpenAI account, requests are routed through your own Azure OpenAI resource — giving you control over data residency, compliance posture, content filtering policies, and the specific model deployment your team has provisioned. With the Azure OpenAI integration, you can:
  • Generate AI-powered summaries, analyses, and responses within incident workflows
  • Send custom prompts to your Azure-deployed GPT models with full Liquid template support
  • Use a system prompt to define the model’s role, tone, or output constraints
  • Satisfy data residency and enterprise compliance requirements by keeping requests within Azure infrastructure

Before You Begin

Before setting up the Azure OpenAI integration, make sure you have:
  • A Rootly account with permission to manage integrations
  • An active Azure OpenAI resource with at least one model deployed
  • Your Azure OpenAI API key, resource name, and deployment name (details below)
Azure OpenAI resources are not automatically created with your Azure subscription. You must request access and deploy a model in the Azure Portal before connecting to Rootly.

Finding Your Credentials

API Key
  1. Go to the Azure Portal and open your Azure OpenAI resource.
  2. Select Keys and Endpoint from the left menu.
  3. Copy either KEY 1 or KEY 2.
Resource Name
Your resource name is the subdomain portion of your Azure OpenAI endpoint URL. For example, if your endpoint is:
https://my-company-openai.openai.azure.com/
your resource name is my-company-openai.

Deployment Name
  1. In the Azure Portal, open your Azure OpenAI resource.
  2. Select Model deployments or Deployments from the left menu.
  3. Copy the name of the deployment you want Rootly to use (for example, gpt-4o or gpt-35-turbo).
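The resource-name rule above is simple subdomain extraction. As a minimal sketch (the function name here is illustrative, not part of any Rootly or Azure API):

```python
from urllib.parse import urlparse

def resource_name_from_endpoint(endpoint: str) -> str:
    """Return the Azure OpenAI resource name (the subdomain) from an endpoint URL."""
    host = urlparse(endpoint).hostname  # e.g. "my-company-openai.openai.azure.com"
    return host.split(".")[0]

print(resource_name_from_endpoint("https://my-company-openai.openai.azure.com/"))
# my-company-openai
```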

Installation

Open the integrations page in Rootly

Navigate to the integrations page in your Rootly workspace and select Azure OpenAI.

Enter your credentials

Enter your API Key, Resource Name, and Deployment Name into the respective fields. Rootly constructs your Azure OpenAI endpoint as:
https://{resource_name}.openai.azure.com/openai/deployments/{deployment_name}
Your API key is encrypted at rest in Rootly.
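The endpoint pattern above is plain string interpolation over the two values you entered. A sketch, assuming nothing beyond the pattern shown (the helper name is illustrative):

```python
def build_endpoint(resource_name: str, deployment_name: str) -> str:
    # Mirrors the pattern above: resource name as the subdomain,
    # deployment name in the path.
    return f"https://{resource_name}.openai.azure.com/openai/deployments/{deployment_name}"

print(build_endpoint("my-company-openai", "gpt-4o"))
# https://my-company-openai.openai.azure.com/openai/deployments/gpt-4o
```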

Save and connect

Once you save, your Azure OpenAI integration is active. The Create OpenAI Chat Completion workflow action becomes available in your incident and action item workflows, routing requests through your Azure deployment.

Workflow Actions

Create OpenAI Chat Completion

Sends a prompt to your Azure-deployed model and captures the response as a workflow output. This is the same action used by the standard OpenAI integration — when Azure OpenAI is connected, Rootly routes requests through your Azure resource automatically.
The action accepts the following fields:
  • Model (required) — Your configured Azure deployment, set by the deployment name you provided
  • Prompt (required) — The user message; supports Liquid templating
  • System Prompt (optional) — Instructions for the model’s role or behavior; supports Liquid templating
  • Temperature (optional) — Sampling temperature between 0.0 and 2.0; controls randomness
  • Max Tokens (optional) — Maximum number of tokens in the response
  • Top P (optional) — Nucleus sampling probability between 0.0 and 1.0
Use Liquid variables in your prompts to include live incident context — for example {{ incident.title }}, {{ incident.severity }}, and {{ incident.description }}. See the Liquid variables reference for all available fields.
The System Prompt field sets the model’s persona or output format — for example: “You are an incident response assistant. Respond in bullet points. Be concise.”
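To make the field mapping concrete, here is a sketch of the kind of Azure OpenAI chat-completions request these fields translate into. This is illustrative, not Rootly's actual implementation: the function name is invented, and the `api-version` value is an assumption (Rootly may pin a different one).

```python
import json

def build_chat_request(resource_name, deployment_name, api_key, prompt,
                       system_prompt=None, temperature=None, max_tokens=None,
                       top_p=None, api_version="2024-02-01"):
    """Sketch of a chat-completions request to an Azure OpenAI deployment.

    The api_version default is an assumption for illustration only.
    """
    url = (f"https://{resource_name}.openai.azure.com/openai/deployments/"
           f"{deployment_name}/chat/completions?api-version={api_version}")
    messages = []
    if system_prompt:  # the System Prompt field becomes a "system" message
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})  # the Prompt field
    body = {"messages": messages}
    # The optional sampling fields map directly onto request parameters.
    if temperature is not None:
        body["temperature"] = temperature
    if max_tokens is not None:
        body["max_tokens"] = max_tokens
    if top_p is not None:
        body["top_p"] = top_p
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    return url, headers, json.dumps(body)
```

Note that in a real workflow the Liquid variables in the Prompt and System Prompt fields are rendered before the request is sent, so the model receives the resolved incident text, not the template.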

Troubleshooting

If authentication fails, confirm that the API key is active and has not been revoked. In the Azure Portal, go to Keys and Endpoint and verify the key is still valid. Also confirm that the resource name matches the Azure OpenAI resource the key belongs to — using a key from a different resource will cause authentication to fail.
If the integration was working and then stopped, the API key may have been rotated in Azure. Update the key in the integration settings. Also check that the deployment name has not been renamed or deleted in the Azure Portal.
If requests are throttled, remember that Azure OpenAI enforces rate limits based on your provisioned capacity (tokens per minute and requests per minute). Running many concurrent Rootly workflows may exceed these limits. Consider staggering workflows, reducing token usage with more focused prompts, or increasing your Azure OpenAI quota in the Azure Portal.
If responses come from an unexpected model, note that model behavior is determined by the deployment you configured — Rootly uses your deployment name directly and does not select a model. If you need a different model, create a new deployment in the Azure Portal and update the deployment name in Rootly.
If a prompt fails to render, check your Liquid syntax — unclosed tags or undefined variables can cause rendering failures. Use the Liquid variables reference to confirm variable names and test your template in a low-stakes workflow first.

OpenAI

Use Rootly’s standard OpenAI integration if you don’t need Azure-specific data residency or compliance controls.

Incident Workflows

Build workflows that use Azure OpenAI models to analyze, summarize, or respond to incidents.

Liquid Variables

Reference for all incident variables available in Liquid-templated prompts.