For an enterprise-grade AI Assistant in 2026, networking and security are the “make-or-break” components. If you are using Terraform and Databricks, you must move away from standard public access and embrace Zero-Trust Networking.
Here is the blueprint for networking and security setup.
1. The Network Backbone: Hub-and-Spoke
To keep your data safe, do not deploy everything into one VNet. Use the Hub-and-Spoke model.
- The Hub: Contains shared services like Azure Firewall, VPN Gateway (for on-prem access), and Centralized DNS Zones.
- The AI Spoke: This is where your Databricks workspace, AI Search, and OpenAI live.
- The Connection: All communication between your spoke and the internet must pass through the Hub’s firewall.
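A minimal Terraform sketch of this topology might look like the following. All names, CIDR ranges, and the firewall's private IP are illustrative assumptions, not values from any real deployment:

```hcl
# Hypothetical hub and AI-spoke VNets with peering, plus a route table
# that forces all spoke egress through the hub firewall.
resource "azurerm_virtual_network" "hub" {
  name                = "vnet-hub"
  resource_group_name = azurerm_resource_group.network.name
  location            = azurerm_resource_group.network.location
  address_space       = ["10.0.0.0/16"]
}

resource "azurerm_virtual_network" "ai_spoke" {
  name                = "vnet-ai-spoke"
  resource_group_name = azurerm_resource_group.network.name
  location            = azurerm_resource_group.network.location
  address_space       = ["10.1.0.0/16"]
}

resource "azurerm_virtual_network_peering" "spoke_to_hub" {
  name                      = "peer-ai-spoke-to-hub"
  resource_group_name       = azurerm_resource_group.network.name
  virtual_network_name      = azurerm_virtual_network.ai_spoke.name
  remote_virtual_network_id = azurerm_virtual_network.hub.id
  allow_forwarded_traffic   = true
}

resource "azurerm_route_table" "spoke_egress" {
  name                = "rt-spoke-egress"
  resource_group_name = azurerm_resource_group.network.name
  location            = azurerm_resource_group.network.location

  route {
    name                   = "default-via-firewall"
    address_prefix         = "0.0.0.0/0"
    next_hop_type          = "VirtualAppliance"
    next_hop_in_ip_address = "10.0.1.4" # assumed private IP of the hub's Azure Firewall
  }
}
```

The route table must then be associated with each spoke subnet (via `azurerm_subnet_route_table_association`) so the 0.0.0.0/0 route actually takes effect.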
2. Private Link & Managed Identities (No Keys!)
In 2026, API keys are a legacy risk. Your architecture should be “Keyless.”
- Private Endpoints: Disable all public network access for ADLS Gen2, AI Search, and OpenAI. Assign each a Private Endpoint within your Spoke VNet. This ensures your data never touches the public internet.
- Managed Identities (System-Assigned):
  - Give your Databricks cluster a Managed Identity with Storage Blob Data Contributor on ADLS.
  - Give your Azure OpenAI resource a Managed Identity to read from AI Search.
- The Result: No secrets to rotate in your Terraform code or Key Vault.
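As a sketch of the keyless pattern, the Terraform below locks an ADLS Gen2 account behind a Private Endpoint and grants data-plane access via RBAC instead of account keys. Resource names and the subnet reference are assumptions; the Databricks identity here uses an Access Connector, which is the identity Azure Databricks exposes for storage access:

```hcl
resource "azurerm_storage_account" "datalake" {
  name                          = "stadatalake001"
  resource_group_name           = azurerm_resource_group.ai.name
  location                      = azurerm_resource_group.ai.location
  account_tier                  = "Standard"
  account_replication_type      = "LRS"
  is_hns_enabled                = true  # hierarchical namespace = ADLS Gen2
  public_network_access_enabled = false # no public ingress at all
}

resource "azurerm_private_endpoint" "datalake_blob" {
  name                = "pe-datalake-blob"
  resource_group_name = azurerm_resource_group.ai.name
  location            = azurerm_resource_group.ai.location
  subnet_id           = azurerm_subnet.private_endpoints.id # assumed spoke subnet

  private_service_connection {
    name                           = "psc-datalake-blob"
    private_connection_resource_id = azurerm_storage_account.datalake.id
    subresource_names              = ["blob"]
    is_manual_connection           = false
  }
}

# Keyless access: an RBAC role assignment replaces storage account keys.
resource "azurerm_role_assignment" "databricks_to_adls" {
  scope                = azurerm_storage_account.datalake.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azurerm_databricks_access_connector.this.identity[0].principal_id
}
```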
3. Databricks-Specific Security (The Terraform Focus)
For a high-security Databricks deployment, your Terraform must include:
- VNet Injection: Do not use the “default” Databricks VNet. Inject Databricks into your own managed VNet with two subnets (public and private).
- No Public IP (NPIP): Enable the “Secure Cluster Connectivity” feature. This ensures your Databricks worker nodes have zero public IP addresses, making them unreachable from the public internet.
- Unity Catalog + Private Link: Configure Unity Catalog with an Access Connector and Private Link so Databricks can reach your metastore without leaving the Azure backbone.
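VNet injection and NPIP are both set through `custom_parameters` on the workspace resource. A sketch, assuming the spoke VNet, the two Databricks subnets, and their NSG associations are defined elsewhere:

```hcl
resource "azurerm_databricks_workspace" "this" {
  name                = "dbw-ai-assistant" # illustrative name
  resource_group_name = azurerm_resource_group.ai.name
  location            = azurerm_resource_group.ai.location
  sku                 = "premium" # required for VNet injection and many security features

  custom_parameters {
    no_public_ip        = true # Secure Cluster Connectivity: workers get no public IPs
    virtual_network_id  = azurerm_virtual_network.ai_spoke.id
    public_subnet_name  = azurerm_subnet.databricks_public.name
    private_subnet_name = azurerm_subnet.databricks_private.name

    public_subnet_network_security_group_association_id  = azurerm_subnet_network_security_group_association.public.id
    private_subnet_network_security_group_association_id = azurerm_subnet_network_security_group_association.private.id
  }
}
```

Note that both subnets must be delegated to `Microsoft.Databricks/workspaces` for the injection to succeed.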
4. Advanced Protection for RAG
Since this assistant handles sensitive internal data, add these two “2026-standard” layers:
- Microsoft Purview Integration: Link your AI Search and OpenAI to Microsoft Purview. This allows you to apply Sensitivity Labels (e.g., “Highly Confidential”). If a document is tagged as such, the AI will refuse to summarize it for a user who doesn’t have that specific clearance.
- AI Content Safety: Place an Azure AI Content Safety layer in front of OpenAI. This detects “Prompt Injection” attacks where a user might try to trick the AI into revealing system prompts or unauthorized data.
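The Content Safety layer can itself be deployed with the same hardened posture as the rest of the stack. A sketch, assuming the `ContentSafety` kind of `azurerm_cognitive_account` (names are illustrative):

```hcl
resource "azurerm_cognitive_account" "content_safety" {
  name                          = "cog-content-safety"
  resource_group_name           = azurerm_resource_group.ai.name
  location                      = azurerm_resource_group.ai.location
  kind                          = "ContentSafety"
  sku_name                      = "S0"
  public_network_access_enabled = false # reachable only via Private Endpoint
  local_auth_enabled            = false # Entra ID tokens only, no API keys

  identity {
    type = "SystemAssigned"
  }
}
```

Your application tier then calls Content Safety (shield/prompt-attack detection) before forwarding any user input to OpenAI.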
Summary Checklist for your Terraform Modules
| Resource | Security Requirement |
|---|---|
| ADLS Gen2 | Firewall enabled; allow only “Selected Networks” (your VNet). |
| Databricks | `custom_parameters { no_public_ip = true }` and VNet Injection enabled. |
| AI Search | `public_network_access_enabled = false`; Private Endpoint active. |
| OpenAI | Managed Identity enabled; `local_auth_enabled = false` (forces Entra ID). |
| DNS | Private DNS Zones for `privatelink.openai.azure.com` and `privatelink.blob.core.windows.net`. |
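The DNS row can be satisfied with a small `for_each` loop that creates both zones and links them to the spoke VNet (the resource group and VNet names are assumptions):

```hcl
locals {
  private_dns_zones = [
    "privatelink.openai.azure.com",
    "privatelink.blob.core.windows.net",
  ]
}

resource "azurerm_private_dns_zone" "zones" {
  for_each            = toset(local.private_dns_zones)
  name                = each.value
  resource_group_name = azurerm_resource_group.network.name
}

# Link each zone to the spoke VNet so private endpoint FQDNs resolve there.
resource "azurerm_private_dns_zone_virtual_network_link" "spoke" {
  for_each              = azurerm_private_dns_zone.zones
  name                  = "link-${replace(each.value.name, ".", "-")}"
  resource_group_name   = azurerm_resource_group.network.name
  private_dns_zone_name = each.value.name
  virtual_network_id    = azurerm_virtual_network.ai_spoke.id
}
```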
Pro-Tip: In your Terraform, use the `azapi` provider if the standard `azurerm` provider doesn’t yet support the latest 2026 AI Search security features. This allows you to call the Azure Resource Manager API directly for cutting-edge settings.
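For example, `azapi_update_resource` can patch a single property on a resource that `azurerm` otherwise manages. The API version and property shown here are placeholders; check the ARM reference for the setting you actually need:

```hcl
terraform {
  required_providers {
    azapi = {
      source = "Azure/azapi"
    }
  }
}

# Patch one property via the ARM API while azurerm owns the rest of
# the resource. API version and property path are assumptions.
resource "azapi_update_resource" "search_hardening" {
  type        = "Microsoft.Search/searchServices@2024-06-01-preview"
  resource_id = azurerm_search_service.this.id

  body = jsonencode({
    properties = {
      publicNetworkAccess = "disabled"
    }
  })
}
```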