Provisioning Azure OpenAI Service with Terraform

Last Updated on August 7, 2025 by Arnav Sharma

In the rapidly evolving field of artificial intelligence, leveraging powerful AI models like OpenAI’s GPT series has become a cornerstone for developing intelligent applications. Microsoft Azure provides managed access to these models through its Azure OpenAI Service, allowing users to deploy and manage AI models efficiently. This blog provides a step-by-step guide on using HashiCorp Terraform to provision the Azure OpenAI Service, thereby simplifying the integration of advanced AI capabilities into your applications.

Prerequisites

Before diving into the Terraform script, ensure you have the following prerequisites in place:

  • Azure Subscription: You need a valid Azure subscription where you can deploy resources.
  • Terraform Installed: Ensure you have Terraform installed on your machine.
  • Azure CLI: The Azure CLI is required for Azure authentication purposes.

Step 1: Configure Terraform Azure Providers

The first step is to configure Terraform with the necessary providers. For deploying the Azure OpenAI Service, we primarily use the azurerm provider.

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~>3.0"
    }
  }
  required_version = ">= 0.14"
}

provider "azurerm" {
  features {}
}

This configuration block sets up the AzureRM provider, specifying version constraints and the required Terraform version.
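If you work with multiple subscriptions, you can optionally pin the provider to a specific one. This is a sketch, not part of the original configuration; the subscription ID below is a placeholder you would replace with your own (or omit entirely to use the subscription selected via `az account set`):

```hcl
# Optional alternative provider block: target a specific subscription explicitly.
# The subscription_id value is a placeholder -- substitute your own.
provider "azurerm" {
  features {}
  subscription_id = "00000000-0000-0000-0000-000000000000"
}
```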

Step 2: Define the Resource Group

A resource group in Azure is a container that holds related resources for an Azure solution. Here, we define a resource group for our OpenAI service:

resource "azurerm_resource_group" "openai_rg" {
  name     = "OpenAI-ResourceGroup"
  location = "East US"
}
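Hard-coded names and locations are fine for a demo, but in practice you may prefer input variables so the same configuration can be reused across environments. A minimal sketch (the variable names here are my own, not from the original configuration):

```hcl
# Parameterize the resource group so name and region can vary per environment.
variable "resource_group_name" {
  type    = string
  default = "OpenAI-ResourceGroup"
}

variable "location" {
  type    = string
  default = "East US"
}

resource "azurerm_resource_group" "openai_rg" {
  name     = var.resource_group_name
  location = var.location
}
```

You could then override these per environment, e.g. `terraform apply -var="location=West Europe"`.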

Step 3: Create Azure OpenAI Service

With the resource group in place, the next step is to create the Azure OpenAI Service instance. We use the azurerm_cognitive_account resource type, setting kind to OpenAI to specify that this account will host OpenAI models.

resource "azurerm_cognitive_account" "openai_service" {
  name                = "openai-service"
  location            = azurerm_resource_group.openai_rg.location
  resource_group_name = azurerm_resource_group.openai_rg.name
  kind                = "OpenAI"
  sku_name            = "S0"  # Standard tier

  # Note: this is a top-level argument, not nested in a "properties" block.
  public_network_access_enabled = true

  tags = {
    environment = "production"
  }
}

This configuration creates a cognitive service account tailored for OpenAI applications, in the Standard (S0) pricing tier.
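Note that some Azure OpenAI capabilities (for example, token-based Microsoft Entra ID authentication) require the account to have a custom subdomain, and in production you may want to restrict public network access. A hedged sketch of both options, where the subdomain value is a placeholder and must be globally unique:

```hcl
# Locked-down variant of the account (illustrative, not from the original post).
resource "azurerm_cognitive_account" "openai_service_locked_down" {
  name                = "openai-service"
  location            = azurerm_resource_group.openai_rg.location
  resource_group_name = azurerm_resource_group.openai_rg.name
  kind                = "OpenAI"
  sku_name            = "S0"

  # Required for Entra ID (AAD) token authentication; placeholder value.
  custom_subdomain_name = "my-openai-service"

  # Deny public access by default; allow specific networks via network_acls.
  public_network_access_enabled = false
  network_acls {
    default_action = "Deny"
  }
}
```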

Step 4: Output Configuration

After provisioning the resources, it’s helpful to output the configuration details. This enables easy access to important information, such as the endpoint of the OpenAI service.

output "openai_service_endpoint" {
  value = azurerm_cognitive_account.openai_service.endpoint
}

output "openai_service_primary_key" {
  value     = azurerm_cognitive_account.openai_service.primary_access_key
  sensitive = true  # Terraform requires this flag for sensitive provider values
}

Create GPT-35-Turbo Deployment in Azure OpenAI Service using Terraform

Prerequisites

Before you begin, ensure you have the following:

  • An Azure subscription: You need an active subscription where you can deploy resources.
  • Terraform installed: Make sure Terraform is installed on your machine.
  • Azure CLI installed: Necessary for authenticating to your Azure account.

Step 1: Configure Terraform Providers

You need to set up both the azurerm and azapi providers in Terraform. The azapi provider will allow us to configure resources that are not yet fully supported by the azurerm provider.

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~>3.0"
    }
    azapi = {
      source = "azure/azapi"
    }
  }
  required_version = ">= 0.14"
}

provider "azurerm" {
  features {}
}

provider "azapi" {}

The azapi Terraform provider is a specialized tool designed to interface directly with Azure’s REST API, allowing users to manage Azure resources that may not be fully supported or available in the more commonly used azurerm provider. This flexibility makes azapi particularly useful for advanced Azure configurations and for deploying newer or preview features that are accessible via Azure’s REST API but not yet exposed through the standard AzureRM Terraform provider.

Key Features of azapi

Here are some of the key features and uses of the azapi provider:

  1. Direct API Access: azapi enables direct interactions with the Azure Management REST API, bypassing the limitations of the existing Terraform providers. It sends raw JSON payloads to the API, which can be dynamically constructed based on Terraform’s configurations.
  2. Support for Latest Azure Features: Since new features in Azure are exposed through the REST API before they are integrated into the azurerm provider, azapi allows users to implement these features without waiting for provider updates.
  3. Custom Resource Management: Users can manage virtually any Azure resource with azapi as long as there is API support for that resource, offering extensive customizability and control over Azure deployments.
  4. Complementary to AzureRM: Rather than replacing azurerm, azapi is often used alongside it to manage resources that are not yet supported by azurerm, making it a powerful complement to the capabilities already available in Terraform for Azure.

How It Works

The azapi provider works by making HTTP requests to the Azure Management REST API. It authenticates through Azure Active Directory (Microsoft Entra ID), just like other Azure tooling. Here’s a brief overview of how azapi operates:

  • Initialization: Like other Terraform providers, you declare and configure the azapi provider in your Terraform scripts.
  • Resource Definition: You define resources using the azapi_resource block, where you specify the API endpoint, the type of the resource, and other necessary details.
  • Resource Configuration: Instead of predefined properties, you use a JSON blob (body) to configure the resource’s properties according to the schema expected by the Azure API.

Step 2: Define Azure Resources

First, define the resource group and the Azure Cognitive Services account that will host the OpenAI instance.

resource "azurerm_resource_group" "openai_rg" {
  name     = "OpenAI-ResourceGroup"
  location = "East US"
}

resource "azurerm_cognitive_account" "openai_service" {
  name                = "openai-service"
  location            = azurerm_resource_group.openai_rg.location
  resource_group_name = azurerm_resource_group.openai_rg.name
  kind                = "OpenAI"
  sku_name            = "S0"
}

Step 3: Deploy GPT-35-Turbo Model

Now, deploy the GPT-35-turbo model using the azapi_resource resource. This resource allows us to make direct REST API calls to Azure to create and manage Azure resources.

resource "azapi_resource" "gpt_35_turbo_deployment" {
  type = "Microsoft.CognitiveServices/accounts/deployments@2023-05-01"
  name = "gpt-35-turbo-deployment"
  parent_id = azurerm_cognitive_account.openai_service.id

  body = jsonencode({
    properties = {
      model = {
        format = "OpenAI"
        name = "gpt-35-turbo"
        version = "0613"
      },
      versionUpgradeOption = "OnceCurrentVersionExpired",
      raiPolicyName = "Microsoft.Default"
    },
    sku = {
      capacity = 120  # Adjust capacity based on your needs
      name = "Standard"
    }
  })
}

This configuration creates a deployment for the GPT-35-turbo model within your Azure OpenAI service account. It sets the model details and specifies the SKU for the deployment, which determines the capacity and scale settings.
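The same pattern extends to other models. For instance, assuming the text-embedding-ada-002 model is available in your region (the model name, version, and capacity below are illustrative, not taken from the original post), an embeddings deployment could look like:

```hcl
# Illustrative second deployment; verify model availability in your region first.
resource "azapi_resource" "embeddings_deployment" {
  type      = "Microsoft.CognitiveServices/accounts/deployments@2023-05-01"
  name      = "text-embedding-ada-002-deployment"
  parent_id = azurerm_cognitive_account.openai_service.id

  body = jsonencode({
    properties = {
      model = {
        format  = "OpenAI"
        name    = "text-embedding-ada-002"
        version = "2"
      }
      versionUpgradeOption = "OnceCurrentVersionExpired"
      raiPolicyName        = "Microsoft.Default"
    }
    sku = {
      name     = "Standard"
      capacity = 120
    }
  })
}
```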

Step 4: Outputs

Finally, to easily retrieve important information about your deployment, configure outputs as follows:

output "openai_service_endpoint" {
  value = azurerm_cognitive_account.openai_service.endpoint
}

output "openai_service_primary_key" {
  value     = azurerm_cognitive_account.openai_service.primary_access_key
  sensitive = true  # Terraform requires this flag for sensitive provider values
}

output "gpt_35_turbo_deployment_id" {
  value = azapi_resource.gpt_35_turbo_deployment.id
}
