Like many modern services, confidential inferencing deploys models and containerized workloads in VMs orchestrated using Kubernetes. Inference runs in Azure Confidential GPU VMs created from an integrity-protected disk image, which includes a container runtime to load the various containers required for inference. Conver
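As an illustrative sketch only, a deployment like this could pin inference pods to confidential GPU nodes with a standard Kubernetes node selector. The labels, names, and image below are hypothetical placeholders, not Azure's actual identifiers:

```yaml
# Hypothetical manifest: all names, labels, and the image are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: confidential-inference
spec:
  replicas: 1
  selector:
    matchLabels:
      app: confidential-inference
  template:
    metadata:
      labels:
        app: confidential-inference
    spec:
      nodeSelector:
        # Assumed label marking Confidential GPU VM node pools
        accelerator: confidential-gpu
      containers:
      - name: inference
        image: example.registry.io/inference:latest  # placeholder image
        resources:
          limits:
            nvidia.com/gpu: 1   # request one GPU per pod
```

The point of the sketch is only that, from the orchestrator's perspective, a confidential workload is scheduled like any other pod; the confidentiality guarantees come from the underlying Confidential GPU VMs and their integrity-protected disk image, not from the manifest itself.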