
Azure OpenAI secure deployment options


A Journey Towards Achieving Secure Azure OpenAI Deployments

This guide provides an overview, implementation steps, and considerations for each option when deploying Azure OpenAI instances.

1. Public OpenAI Endpoint using API Key Authentication

Overview: This method uses an API key to authenticate requests to the Azure OpenAI endpoint.

Implementation Steps:

  1. Sign up and subscribe to the Azure OpenAI Service to obtain your API keys and endpoint.
  2. Create a model deployment.
  3. Call the API endpoint from your client, using the API key for authentication (see the sketch after this list).
  4. NOTE: It is considered a best practice to store your keys in Azure Key Vault rather than managing them manually in your code. However, even with this approach your application still needs a credential to reach Key Vault itself, so the secret-management problem is moved rather than solved. The following sections discuss better ways to handle this.
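As a minimal illustration of step 3, the sketch below calls a chat model through the Azure OpenAI endpoint using the openai Python package and an API key. The endpoint, key, and deployment name are placeholders supplied via environment variables, and the API version shown is an assumption; use one supported by your resource.

```python
# Minimal sketch: Azure OpenAI call authenticated with an API key.
# AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY and the deployment name are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],          # better: fetch this from Azure Key Vault
    api_version="2024-02-01",                            # assumed; use a version your resource supports
)

response = client.chat.completions.create(
    model="my-gpt-deployment",  # hypothetical model deployment name from step 2
    messages=[{"role": "user", "content": "Hello from a key-authenticated client"}],
)
print(response.choices[0].message.content)
```

Note that anyone who obtains the key can make the same call, which is exactly the weakness listed below.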

Pros:

  • Simple to implement.
  • Provides a basic level of security.

Cons:

  • If the API key is compromised, unauthorised users can access the endpoint.

2. Public OpenAI Endpoint using Managed Identity authentication through Microsoft Entra ID

Overview: This method uses Microsoft Entra ID (formerly Azure Active Directory) and Managed Identities for Azure resources for authentication.

Implementation Steps:

  1. Create a Managed Identity in the Azure portal.
  2. Assign the Managed Identity to the service that will call the Azure OpenAI endpoint, e.g. an Azure Function or Azure Container App, and grant it an appropriate role (such as Cognitive Services OpenAI User) on the Azure OpenAI resource.
  3. Use the Managed Identity to authenticate requests to your OpenAI endpoints (see the sketch after this list).
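As a rough sketch of step 3, the snippet below uses DefaultAzureCredential from the azure-identity package to obtain Microsoft Entra ID tokens for Azure OpenAI. It assumes the identity already has a suitable role on the Azure OpenAI resource; the endpoint, deployment name, and API version are placeholders.

```python
# Minimal sketch: Azure OpenAI call authenticated with a Managed Identity via Microsoft Entra ID.
# No API keys are stored; tokens come from the identity assigned to the hosting service.
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# DefaultAzureCredential resolves to the Managed Identity when running on Azure
# (and to your developer credentials when running locally).
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",  # token scope for Azure OpenAI
)

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_ad_token_provider=token_provider,
    api_version="2024-02-01",  # assumed; use a version your resource supports
)

response = client.chat.completions.create(
    model="my-gpt-deployment",  # hypothetical deployment name
    messages=[{"role": "user", "content": "Hello from a Managed Identity client"}],
)
print(response.choices[0].message.content)
```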

Pros:

  • Provides a higher level of security than API key authentication.
  • Managed Identity credentials are handled automatically by Azure; there are no keys to store or rotate.

Cons:

  • Although you no longer have to manage access keys, the endpoint is still exposed to the public internet, which may not be acceptable depending on your organizational requirements.

3. Combine Private Endpoints and Managed Identity authentication for more security and privacy

Overview:

  • Create a more secure connection to OpenAI services using Azure Private Endpoints, ensuring traffic between Azure services and OpenAI does not traverse the public internet.

Implementation Steps:

  1. Create a VNET and a subnet for the Private Endpoint.
  2. Set up a Private Link by creating a Private Endpoint in your Azure VNET (see the sketch after this list).
  3. Assign the Managed Identity to the service that will call the Azure OpenAI endpoint, e.g. an Azure Function or Azure Container App.
  4. Deploy the service that will call the Azure OpenAI endpoint into (or connect it to) the same VNET where the Private Endpoint is deployed.
  5. Use the Managed Identity to authenticate requests to your OpenAI endpoints.
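The client code is unchanged from option 2; what changes is the network path. Purely as an illustration of step 2, the sketch below creates a Private Endpoint for an existing Azure OpenAI resource with the azure-mgmt-network package. All names, IDs, and the region are placeholders, exact parameter names can vary between SDK versions, and you still need a privatelink.openai.azure.com private DNS zone so the endpoint hostname resolves to the private IP.

```python
# Rough sketch: create a Private Endpoint for an existing Azure OpenAI resource.
# All names, IDs and the region below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

subscription_id = "<subscription-id>"
network_client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

subnet_id = (
    "/subscriptions/<subscription-id>/resourceGroups/my-rg"
    "/providers/Microsoft.Network/virtualNetworks/my-vnet/subnets/pe-subnet"
)
openai_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/my-rg"
    "/providers/Microsoft.CognitiveServices/accounts/my-openai"
)

poller = network_client.private_endpoints.begin_create_or_update(
    resource_group_name="my-rg",
    private_endpoint_name="pe-openai",
    parameters={
        "location": "westeurope",
        "subnet": {"id": subnet_id},
        "private_link_service_connections": [
            {
                "name": "openai-connection",
                "private_link_service_id": openai_resource_id,
                "group_ids": ["account"],  # sub-resource for Azure OpenAI (Cognitive Services) accounts
            }
        ],
    },
)
private_endpoint = poller.result()
print(f"Private Endpoint provisioned: {private_endpoint.name}")
```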

Pros:

  • Ensures that data does not traverse the public internet, providing better security and privacy.
  • Combines the security of Managed Identities with the network isolation of Private Endpoints, i.e. defense in depth.

Cons:

  • Setting up a Private Endpoint and configuring network security can be complex and potentially increase costs.

4. Add Azure Front Door for High-Availability Multi-Region Deployment

Overview:

  • Implement Azure Front Door to distribute traffic across multiple regions, enhancing availability and performance.

Implementation Steps:

  1. Create an Azure Front Door instance and configure its routing rules to distribute traffic across your Azure services deployed in multiple regions.
  2. Set up health probes to monitor the availability of your services in different regions and route traffic accordingly (see the sketch after this list).
  3. Define failover policies to automatically redirect traffic in case of regional outages or performance issues.
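As a rough, SDK-level illustration of steps 2 and 3, the sketch below defines a Front Door origin group with a health probe and adds two regional backends with different priorities, so traffic fails over to the secondary region when the primary is unhealthy. It assumes an existing Front Door Standard/Premium profile, uses placeholder names and hostnames, and property names may differ slightly between azure-mgmt-cdn versions; many teams configure this through the portal, Bicep, or Terraform instead.

```python
# Rough sketch: origin group with health probing and priority-based failover
# on an existing Azure Front Door Standard/Premium profile. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cdn import CdnManagementClient

cdn_client = CdnManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Origin group: health probes decide which regional backend is currently healthy.
cdn_client.afd_origin_groups.begin_create(
    resource_group_name="my-rg",
    profile_name="my-frontdoor-profile",  # assumed to exist already
    origin_group_name="openai-backends",
    origin_group={
        "load_balancing_settings": {"sample_size": 4, "successful_samples_required": 3},
        "health_probe_settings": {
            "probe_path": "/",
            "probe_request_type": "HEAD",
            "probe_protocol": "Https",
            "probe_interval_in_seconds": 30,
        },
    },
).result()

# Two regional backends: the lower priority wins while healthy, the other is the failover.
for name, host, priority in [
    ("primary-westeurope", "api-westeurope.example.com", 1),
    ("secondary-eastus", "api-eastus.example.com", 2),
]:
    cdn_client.afd_origins.begin_create(
        resource_group_name="my-rg",
        profile_name="my-frontdoor-profile",
        origin_group_name="openai-backends",
        origin_name=name,
        origin={
            "host_name": host,
            "origin_host_header": host,
            "priority": priority,
            "weight": 1000,
            "enabled_state": "Enabled",
        },
    ).result()
```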

Pros:

  • Improves the availability and performance of applications on a global scale.
  • Offers seamless failover capabilities to ensure uninterrupted service.

Cons:

  • The multi-region setup and Azure Front Door can significantly increase costs.
  • This setup requires careful planning and configuration to ensure effective traffic distribution and failover.

5. Use Azure Front Door and Multi-Region APIM with Private Endpoints by leveraging VNET Peering

Overview:

  • Combine Azure Front Door with Azure API Management (APIM) instances deployed in multiple regions and secured with Private Endpoints. Use VNET peering for secure, inter-VNET communication.

Implementation Steps:

  1. APIM Deployment: Deploy APIM instances in multiple regions, each within its own VNET.
  2. Private Endpoints: Secure these APIM instances with Private Endpoints.
  3. VNET Peering: Establish VNET peering between the VNETs of your APIM instances and your multi-region Azure OpenAI instances (see the sketch after this list).
  4. Integrate with Front Door: Configure Azure Front Door to route external traffic to the regional APIM instances, ensuring high availability and performance.
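As a small illustration of step 3, the sketch below peers an APIM VNET with an OpenAI VNET using azure-mgmt-network. Peering is not transitive and must be created in both directions; all names, resource groups, and IDs are placeholders.

```python
# Rough sketch: two-way VNET peering between an APIM VNET and an Azure OpenAI VNET.
# Names, resource groups and IDs are placeholders; repeat for each region pair.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

network_client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

VNET_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/{rg}"
    "/providers/Microsoft.Network/virtualNetworks/{vnet}"
)

def peer(resource_group: str, vnet: str, remote_rg: str, remote_vnet: str) -> None:
    """Create one direction of the peering; call twice for both directions."""
    network_client.virtual_network_peerings.begin_create_or_update(
        resource_group_name=resource_group,
        virtual_network_name=vnet,
        virtual_network_peering_name=f"{vnet}-to-{remote_vnet}",
        virtual_network_peering_parameters={
            "remote_virtual_network": {"id": VNET_ID.format(rg=remote_rg, vnet=remote_vnet)},
            "allow_virtual_network_access": True,
            "allow_forwarded_traffic": True,
        },
    ).result()

# Peering only takes effect when it exists on both sides.
peer("apim-rg", "apim-vnet-westeurope", "openai-rg", "openai-vnet-westeurope")
peer("openai-rg", "openai-vnet-westeurope", "apim-rg", "apim-vnet-westeurope")
```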

Pros:

  • Combines secure network architecture with global reach and failover capabilities.
  • Ensures secure inter-service communication through VNET peering and Private Endpoints.

Cons:

  • This is among the most complex and costly options, requiring advanced Azure networking knowledge.
  • Managing multi-region deployments and networking configurations demands significant operational effort.

Why use Azure Front Door and APIM at the same time?

In option 5, using Azure API Management (APIM) in conjunction with Azure Front Door and Private Endpoints is a strategic choice that brings several key benefits to a high-availability, multi-region deployment. APIM acts as a critical layer that abstracts the backend services from the client applications, providing a centralized gateway for managing, securing, and optimizing API calls. Here's why APIM earns its place in this setup:

  • Enhanced Security: By integrating APIM with Private Endpoints, you're able to enforce more stringent security policies, including IP whitelisting, OAuth, and rate limiting. This ensures that only authorized consumers can access your APIs, significantly reducing the risk of attacks and unauthorized access.
  • Performance Optimization: APIM can cache responses at the edge, reducing the number of round trips to the backend services. This, combined with Azure Front Door's global routing, optimizes the performance and latency of API calls, providing a faster and more reliable experience for users worldwide.
  • Centralized Management: APIM allows for the central management of APIs, simplifying the process of publishing, documenting, and analyzing APIs across all regions. This centralization is crucial for maintaining consistency in API behaviour and policies, regardless of the region from which the services are accessed.

Of course, you could remove the APIM layer and use Azure Front Door alone as the global load balancer in your multi-region Azure OpenAI deployment. However, without APIM you would lose the ability to manage and secure your APIs centrally, which could lead to inconsistencies and potential security risks, and you would also lose the performance benefits of edge caching, which could result in slower response times for your users.

Fig 1.1: Azure OpenAI Landing Zone Reference Architecture


Closing Thoughts

As is usually the case when working with technology, the best option often depends on your unique scenario. Here are some of the things we suggest you consider:

  • Security vs. Simplicity: Balancing security requirements with the implementation complexity is crucial. More secure options tend to require more complex setups.
  • Cost: Higher security and availability often come with increased costs. Assess the budget against the need for high availability and advanced security features.
  • Technical Expertise: Some options require deep knowledge of Azure's networking and security features. Ensure your team has the necessary expertise or is willing to invest in learning.
  • Use Case Sensitivity: The nature of your application (its sensitivity, data privacy requirements, and expected load) will heavily influence your choice. Highly sensitive applications might necessitate more secure and isolated options, whereas less critical applications might opt for simplicity and lower costs.
