Introduction

The Google Gemini integration lets you connect Rootly to your organization’s Google Gemini API account. Once connected, a new Gemini Chat Completion workflow action becomes available, allowing you to send prompts to Gemini models and capture their responses — directly within your incident and action item workflows. With the Google Gemini integration, you can:
  • Generate AI-powered incident summaries, analyses, and recommendations
  • Send custom prompts to Gemini models with full Liquid template support
  • Use a system prompt to define the model’s role, tone, or output constraints
  • Choose from any Gemini model available on your API account

Before You Begin

Before setting up the Google Gemini integration, make sure you have:
  • A Google AI Studio API key with access to at least one Gemini model
Your API key is validated against the Gemini API when you save the integration. If validation fails, confirm the key is active and has access to at least one Gemini model.

Installation

Open the integrations page in Rootly

Navigate to the integrations page in your Rootly workspace and select Google Gemini.

Enter your API key

Paste your Google AI Studio API key into the API Key field. Rootly validates the key by fetching your available Gemini models before saving. Your key is encrypted at rest in Rootly.

Integration connected

Your Google Gemini integration is active. The Gemini Chat Completion workflow action is now available in your incident and action item workflows.

Workflow Actions

Gemini Chat Completion

Sends a prompt to a Gemini model and captures the response as a workflow output. The model list is fetched dynamically from your API account.
| Field | Description | Required |
| --- | --- | --- |
| Model | The Gemini model to use — fetched from your account | Yes |
| Prompt | The user message — supports Liquid templating | Yes |
| System Prompt | Instructions for the model’s role or behavior — supports Liquid templating | No |
Use Liquid variables in your prompts to include live incident context — for example {{ incident.title }}, {{ incident.severity }}, and {{ incident.description }}. See the Liquid variables reference for all available fields.
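For instance, a prompt combining these variables might look like the sketch below. The wording of the prompt is illustrative; the variables are the documented incident fields, and the standard Liquid `default` filter guards against fields that are unset at render time:

```liquid
Summarize the current state of this incident for stakeholders.

Title: {{ incident.title }}
Severity: {{ incident.severity | default: "unknown" }}

Details:
{{ incident.description }}
```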
The System Prompt field sets the model’s persona or output format — for example: “You are an incident response assistant. Respond in bullet points. Be concise.”

Troubleshooting

Rootly validates your API key by fetching the list of available Gemini models when you save. If validation fails, confirm the key is active in Google AI Studio and has not been revoked or restricted.
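To check a key outside Rootly, you can call the same public Gemini ListModels endpoint yourself. The sketch below is illustrative, not Rootly's actual implementation: it assumes the standard `v1beta/models` endpoint and the `models`/`name` fields of its JSON response, and the `gemini_model_names` helper is a hypothetical stand-in for the filtering the integration performs.

```python
import json
import urllib.request

# Public Gemini API endpoint used to enumerate models available to a key.
LIST_MODELS_URL = "https://generativelanguage.googleapis.com/v1beta/models"


def fetch_models(api_key: str) -> dict:
    """Call the ListModels endpoint; an invalid or revoked key raises an HTTPError."""
    with urllib.request.urlopen(f"{LIST_MODELS_URL}?key={api_key}") as resp:
        return json.load(resp)


def gemini_model_names(payload: dict) -> list[str]:
    """Filter a ListModels response down to Gemini models only."""
    return [m["name"] for m in payload.get("models", []) if "gemini" in m["name"]]


# Usage (requires a live key):
#   models = gemini_model_names(fetch_models("YOUR_API_KEY"))
```

An empty result (or an HTTP 400/403 error) from this check points to the same causes described above: a revoked, restricted, or tier-limited key.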
If the integration was working and then stopped, the API key may have been revoked or rotated. Update the key in the integration settings — Rootly re-validates on save.
Google Gemini enforces rate limits based on your API tier. Running many concurrent workflows may exceed requests-per-minute limits. Consider staggering workflows or upgrading your Google AI Studio plan for higher quota.
The model list is fetched dynamically from your API account and is filtered to Gemini models. If a model you expect is missing, confirm your API key has access to it — some models may require specific API tiers or allowlisting.
Check your Liquid syntax — unclosed tags or undefined variables can cause rendering failures. Use the Liquid variables reference to confirm variable names and test your template in a low-stakes workflow first.

Incident Workflows

Build workflows that use Gemini models to analyze, summarize, or respond to incidents.

Liquid Variables

Reference for all incident variables available in Liquid-templated prompts.

Rootly AI

Learn about Rootly’s built-in AI features for incident management.