
Ubuntu powers Azure’s confidential AI offering

Jehudi

on 2 October 2024

Microsoft Azure has announced the general availability of its confidential virtual machines (VMs) with NVIDIA H100 Tensor Core GPUs, powered by Ubuntu. This offering combines the hardware-based protection of AMD EPYC processors with NVIDIA’s latest GPU technology to enable secure, high-performance AI workloads in the cloud. Together, these technologies allow sensitive sectors to adopt AI by addressing long-standing concerns about the privacy of critical data.

Harnessing confidential computing with Ubuntu

Confidential computing has transformed data security in the cloud. By leveraging hardware-based Trusted Execution Environments (TEEs), confidential computing ensures data privacy and integrity throughout its lifecycle. This becomes particularly important when dealing with sensitive data, such as in healthcare, finance, or the public sector. Confidential computing enables cloud AI processing with strong technical guarantees that the data and model remain confidential, ensuring data protection where it is paramount.

How confidential AI works

Confidential AI on Azure combines the strengths of AMD SEV-SNP technology with NVIDIA H100 GPUs. This architecture provides a fortified environment that secures data in use, in transit, and at rest. Here’s how it works:

  • CPU-TEE: Ubuntu confidential VMs operate on AMD 4th Gen EPYC processors with SEV-SNP, which encrypts and integrity-protects the entire address space of the VM using hardware-managed keys. This prevents unauthorized access or modification by any code outside the TEE.
  • GPU-TEE: The NVIDIA H100 Tensor Core GPUs further extend these protections to the GPU, ensuring that all computations within the GPU are secure.
  • Encrypted PCIe Communication: All traffic between the VM and GPU is encrypted and integrity protected, safeguarding against advanced attacks.
  • Attestation: This feature allows cryptographic verification of the security claims of both CPU and GPU TEEs, ensuring that data is processed only for its intended purpose.
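
These protections are visible from inside the guest itself. As a rough illustration (not part of the official offering), the following Python sketch shows one way to sanity-check a running Ubuntu confidential VM: it assumes the SEV-SNP guest driver exposes /dev/sev-guest, that the kernel logs its active memory-encryption features to dmesg, and that the H100 confidential-computing driver ships the nvidia-smi conf-compute subcommand. Device paths, flags, and output formats may vary across kernel and driver versions.

```python
#!/usr/bin/env python3
"""Illustrative sanity checks inside an Ubuntu confidential VM on Azure.

Assumptions (not taken from the announcement): /dev/sev-guest is exposed by the
SEV-SNP guest driver, dmesg reports the active memory-encryption features, and
the H100 driver provides `nvidia-smi conf-compute`. Adjust to your versions.
"""
import os
import subprocess


def sev_snp_guest_device_present() -> bool:
    """The SEV-SNP attestation device should only exist inside a SEV-SNP guest."""
    return os.path.exists("/dev/sev-guest")


def kernel_memory_encryption_line() -> str:
    """Return the kernel's 'Memory Encryption Features active' log line, if readable.

    Reading dmesg may require root; an empty string is returned on failure.
    """
    try:
        log = subprocess.run(["dmesg"], capture_output=True, text=True, check=True).stdout
    except (subprocess.CalledProcessError, FileNotFoundError):
        return ""
    return next((line.strip() for line in log.splitlines()
                 if "Memory Encryption Features active" in line), "")


def gpu_cc_ready_state() -> str:
    """Query the GPU confidential-compute ready state (H100 CC-capable drivers only)."""
    try:
        return subprocess.run(
            ["nvidia-smi", "conf-compute", "-grs"],  # -grs: get ready state
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        return "nvidia-smi conf-compute not available"


if __name__ == "__main__":
    print("SEV-SNP guest device present:", sev_snp_guest_device_present())
    print("Kernel:", kernel_memory_encryption_line() or "no memory-encryption line found")
    print("GPU:", gpu_cc_ready_state())
```

A production deployment would go further and fetch signed attestation reports from both the CPU and GPU TEEs, verifying them against an attestation service before releasing any sensitive data to the VM.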

Ubuntu: the foundation of confidential AI

Ubuntu powers these confidential VMs on Azure, and they are available to launch now. We recommend Ubuntu Pro for its 12 years of extended security maintenance and additional enterprise-grade capabilities, which provide a more comprehensive security posture for your sensitive workloads.

The following Ubuntu images are available for Azure confidential VMs:

  • Ubuntu 22.04 LTS
  • Ubuntu 24.04 LTS
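
For illustration, here is a minimal, hypothetical sketch of provisioning such a VM with the Azure SDK for Python. The resource names, SSH key, VM size (NCCads H100 v5 series) and the Canonical confidential-VM image URN are assumptions made for this example; confirm the current values in the Azure documentation and Marketplace before use, and note that a network interface is assumed to already exist.

```python
"""Hypothetical sketch: provisioning an Ubuntu confidential GPU VM on Azure.

Assumptions (not from the announcement): azure-identity and azure-mgmt-compute
are installed, a NIC already exists, and the VM size and image URN below are
placeholders to verify against current Azure Marketplace listings.
"""
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"       # placeholder
RESOURCE_GROUP = "confidential-ai-rg"       # placeholder
NIC_ID = "<existing-network-interface-id>"  # placeholder

client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

poller = client.virtual_machines.begin_create_or_update(
    RESOURCE_GROUP,
    "ubuntu-cvm-h100",
    {
        "location": "eastus2",  # one of the generally available regions
        # NCCads H100 v5 is the confidential GPU VM series; verify the exact size name.
        "hardware_profile": {"vm_size": "Standard_NCC40ads_H100_v5"},
        "storage_profile": {
            # Canonical confidential-VM image for Ubuntu 22.04 LTS (verify the URN).
            "image_reference": {
                "publisher": "Canonical",
                "offer": "0001-com-ubuntu-confidential-vm-jammy",
                "sku": "22_04-lts-cvm",
                "version": "latest",
            },
            "os_disk": {
                "create_option": "FromImage",
                "managed_disk": {
                    # Encrypt the OS disk together with the VM guest state.
                    "security_profile": {"security_encryption_type": "DiskWithVMGuestState"}
                },
            },
        },
        "os_profile": {
            "computer_name": "ubuntu-cvm-h100",
            "admin_username": "azureuser",
            "linux_configuration": {
                "disable_password_authentication": True,
                "ssh": {
                    "public_keys": [{
                        "path": "/home/azureuser/.ssh/authorized_keys",
                        "key_data": "<ssh-public-key>",  # placeholder
                    }]
                },
            },
        },
        # Mark the VM as confidential, with secure boot and vTPM enabled.
        "security_profile": {
            "security_type": "ConfidentialVM",
            "uefi_settings": {"secure_boot_enabled": True, "v_tpm_enabled": True},
        },
        "network_profile": {"network_interfaces": [{"id": NIC_ID}]},
    },
)
print("Provisioning state:", poller.result().provisioning_state)
```

The same configuration can also be expressed through the Azure CLI or an ARM/Bicep template; the essential pieces are the ConfidentialVM security type, secure boot and vTPM settings, guest-state disk encryption, and a confidential-VM-capable Ubuntu image.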

NVIDIA’s Perspective

“NVIDIA’s collaboration with Canonical enables organizations to harness the full potential of confidential AI computing. Confidential Computing can deliver the performance and protection enterprises need to innovate with confidence in today’s AI landscape,” said Michael O’Connor, Senior Director, Confidential Computing and AI Enterprise Architecture, NVIDIA.

Availability

Azure confidential VMs with NVIDIA H100 GPUs are now generally available in the East US 2 and West Europe regions, with plans for further expansion.

Use cases

The general availability of Azure confidential VMs with NVIDIA H100 GPUs unlocks numerous possibilities across various industries, addressing concerns about data privacy and regulatory compliance:

  • Secure AI in sensitive sectors: Industries like healthcare, finance, and government can now leverage powerful AI capabilities without compromising data security. For instance, healthcare providers can analyze patient data to improve diagnoses while ensuring strict compliance with privacy regulations.
  • Collaborative innovation: Organizations can pool resources and data with partners or competitors to train robust AI models without exposing proprietary datasets. This accelerates advancements in fields previously limited by data privacy concerns, such as drug discovery or fraud detection.
  • Protected model deployment: AI service providers can deploy their models at scale with the assurance that their intellectual property remains secure. This prevents unauthorized access to model weights or snapshots, safeguarding proprietary algorithms.
  • Regulatory compliance: Confidential AI enables the use of valuable private data across various domains without violating compliance requirements. Industries subject to strict data protection laws can now leverage advanced AI capabilities while demonstrating verifiable compliance, opening up new avenues for innovation in highly regulated sectors.

Conclusion

The integration of Ubuntu with Azure confidential VMs featuring NVIDIA H100 GPUs sets a new standard for secure AI workloads in the cloud. This collaboration ensures that organizations can leverage powerful AI capabilities while maintaining the highest standards of data privacy and security.
If you have any questions or need further assistance, please contact us. We look forward to helping you navigate the future of AI security and innovation.

