Your AI Provider Doesn’t Need to See Your Data to Work on It
Dec 10, 2025
Data protection is one of the biggest barriers to using AI, especially if you’re in an enterprise with strict regulations about how you can use data. At the same time, companies are rushing to unlock innovation and business potential by tapping into the zettabytes of available data. The fastest way past this roadblock is surprisingly simple: never let your AI or cloud infrastructure provider see your data.
But how can this possibly work?
Confidential computing isolates AI models, data, and workloads in hardware-based trusted execution environments (TEEs) with encrypted and integrity-protected memory. Models can work on data inside these protected environments without exposing that data, or the models themselves, to the host operating system, hypervisor, or cloud provider.
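How does software know this protection is available? On AMD processors, CPUID leaf 0x8000001F advertises the memory-encryption feature set. Below is a minimal sketch (ours, not AMD sample code) that checks the published feature bits; note that it reports hardware capability, not whether a particular VM is actually running with encryption enabled, which a guest would typically confirm from its kernel log.

```c
/* Minimal sketch: query AMD CPUID leaf 0x8000001F (memory encryption)
 * and report which SEV-family features the processor advertises. */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* __get_cpuid returns 0 if the requested leaf is out of range. */
    if (!__get_cpuid(0x8000001F, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 0x8000001F not available");
        return 1;
    }

    /* Bit assignments per AMD's published CPUID documentation. */
    printf("SME     : %s\n", (eax & (1u << 0)) ? "yes" : "no");
    printf("SEV     : %s\n", (eax & (1u << 1)) ? "yes" : "no");
    printf("SEV-ES  : %s\n", (eax & (1u << 3)) ? "yes" : "no");
    printf("SEV-SNP : %s\n", (eax & (1u << 4)) ? "yes" : "no");
    return 0;
}
```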
Our confidential computing technology, AMD Secure Encrypted Virtualization (SEV), helps protect both proprietary AI models and sensitive data during training and inference, with verifiable attestation throughout the AI development lifecycle. All of the leading cloud providers have deployed AMD SEV at scale, so customers large and small can leverage the industry’s most mature and broad confidential computing ecosystem.¹ This means you can use private data in a public cloud while AMD SEV helps ensure your data won’t be exposed.
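Attestation is what turns that promise into something you can check. Inside an SEV-SNP confidential VM on Linux, a workload can ask the AMD secure processor for a signed attestation report through the kernel’s sev-guest driver. The sketch below is a minimal, unhardened example that assumes an SNP-enabled guest kernel exposing /dev/sev-guest; most deployments use existing tooling (such as the open-source snpguest utility) rather than raw ioctls.

```c
/* Sketch: request an SEV-SNP attestation report from inside a
 * confidential VM via the Linux sev-guest driver. Struct layouts and
 * the SNP_GET_REPORT ioctl come from <linux/sev-guest.h>. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/sev-guest.h>

int main(void)
{
    struct snp_report_req req = {0};
    struct snp_report_resp resp = {0};

    /* 64 bytes of caller-chosen data (e.g. a nonce, or the hash of a
     * public key) that the firmware echoes back inside the signed report. */
    memcpy(req.user_data, "example-nonce", 13);

    struct snp_guest_request_ioctl arg = {
        .msg_version = 1,
        .req_data = (__u64)(uintptr_t)&req,
        .resp_data = (__u64)(uintptr_t)&resp,
    };

    int fd = open("/dev/sev-guest", O_RDWR);
    if (fd < 0) { perror("open /dev/sev-guest"); return 1; }

    if (ioctl(fd, SNP_GET_REPORT, &arg) < 0) {
        perror("SNP_GET_REPORT");
        close(fd);
        return 1;
    }
    close(fd);

    /* resp.data holds the firmware's signed report message; save it so a
     * relying party can verify it against AMD's certificate chain. */
    FILE *out = fopen("report.bin", "wb");
    if (!out) { perror("fopen"); return 1; }
    fwrite(resp.data, 1, sizeof(resp.data), out);
    fclose(out);

    puts("attestation report written to report.bin");
    return 0;
}
```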
We’re excited about how confidential computing is accelerating AI innovation in the real world and facilitating the collaboration in critical areas like healthcare, financial services, and information technology that would not be possible otherwise. Here are some of the most recent solutions built on AMD SEV.
Unlocking zettabytes of data for real change
It’s astounding to realize that the world’s data is now measured in zettabytes: one zettabyte is a billion terabytes. A good portion of that data needs to be kept secure. But what if we could learn from it without compromising its privacy? The potential is enormous.
Take medical data, for example. By combining patient data from clinics around the world, researchers can learn more about diseases like cancer and understand which treatments lead to the best outcomes. However, patient data is protected by the Health Insurance Portability and Accountability Act (HIPAA) in the U.S. and by other data privacy regulations elsewhere. With confidential computing, researchers can run AI models on sensitive patient data inside hardware-protected environments in the cloud, without exposing that data to the cloud provider or other unauthorized parties, and then share the results with fellow researchers, healthcare providers, and pharmaceutical teams while the underlying source data stays private.
One company making this happen is BeeKeeperAI. BeeKeeperAI uses AMD SEV on Microsoft Azure to create confidential VMs that isolate data from the cloud service provider, the data owner, the algorithm owner, and even BeeKeeperAI itself. This creates a secure environment for multi-party AI collaboration, which is useful for accelerating medical research and discovery. Patient data, AI models, and algorithms can remain protected by AMD SEV memory encryption and isolation while in use inside the confidential VM, so no one outside the VM can directly inspect them. Data owners and model developers keep full control and maintain the confidentiality of their assets.
Without the confidence that all this healthcare data will be handled in accordance with privacy standards, the medical use cases for AI would be sorely limited.
AI services that are confidential by design to protect privacy
Trust is everything in private conversations, whether personal or professional, and it’s also a huge business opportunity. Providers that build AI services with confidentiality at the core can offer smarter, more personalized experiences without asking users to trade away their privacy. Confidential computing strengthens this further by helping protect data even while it’s being processed, keeping sensitive content shielded from providers and their infrastructure. Companies that offer real privacy can earn deeper loyalty, attract more users, and stand out in a crowded market.
In 2025, Meta released Private Processing for WhatsApp, which lets WhatsApp users apply Meta AI to their messages while preserving privacy. Messages are handled in a confidential environment that no one, including WhatsApp, can access. Users can then use AI to summarize conversations, draft messages, and generate images.
Private Processing for WhatsApp runs on confidential AI infrastructure built on AMD SEV, which isolates WhatsApp conversations, AI pipelines, and models. These privacy guarantees are verified through remote attestation, which confirms the integrity and authenticity of the system.
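To make that verification step concrete, here is an illustrative relying-party check against a report like the one produced above. It compares only the launch measurement, using offsets from the SEV-SNP ABI specification; the golden value is a placeholder, and a real verifier must first validate the report’s ECDSA signature against AMD’s VCEK certificate chain, a step omitted here for brevity.

```c
/* Illustrative relying-party check: compare the launch measurement in an
 * SEV-SNP attestation report against a known-good ("golden") value.
 * Per the SEV-SNP ABI spec, the firmware's report message carries a
 * 32-byte response header, and the 48-byte launch measurement sits at
 * offset 0x90 within the report itself. A real verifier must ALSO check
 * the report's signature against AMD's VCEK cert chain first. */
#include <stdio.h>
#include <string.h>

#define MSG_HDR_LEN     32    /* status(4) + report_size(4) + reserved(24) */
#define MEASUREMENT_OFF 0x90  /* offset of the measurement within the report */
#define MEASUREMENT_LEN 48

int main(void)
{
    unsigned char buf[4096];
    /* Placeholder golden measurement of the approved VM image; in practice
     * this comes from reproducibly building and pre-measuring the image. */
    static const unsigned char golden[MEASUREMENT_LEN] = {0};

    FILE *in = fopen("report.bin", "rb");
    if (!in) { perror("fopen report.bin"); return 1; }
    size_t n = fread(buf, 1, sizeof(buf), in);
    fclose(in);

    if (n < MSG_HDR_LEN + MEASUREMENT_OFF + MEASUREMENT_LEN) {
        fputs("report too short\n", stderr);
        return 1;
    }

    const unsigned char *m = buf + MSG_HDR_LEN + MEASUREMENT_OFF;
    if (memcmp(m, golden, MEASUREMENT_LEN) == 0)
        puts("launch measurement matches the approved image");
    else
        puts("MISMATCH: VM is not running the approved image");
    return 0;
}
```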
Identifying financial fraud without sharing sensitive data
All over the world, millions of financial transactions are processed at any given moment, and the data in those transactions must be kept secure to protect people, businesses, and their financial institutions. Ironically, the same data silos that keep this information secure create opportunities for fraud, like duplicate financing.
Duplicate financing is a fraudulent scheme in which a company or person applies for credit at more than one institution using the same collateral, much like a home buyer attempting to take out multiple mortgages on the same house. Because these institutions can’t see each other’s pipelines, they can miss these duplicate applications.
MonetaGo uses AMD SEV and Confidential Google Kubernetes Engine (GKE) on Google Cloud Platform to help lenders spot fraud without exposing any source data. Banks share their lending documents with MonetaGo, which derives privacy-preserving digital fingerprints from them and flags duplicates without revealing the underlying documents. By helping reduce fraud risk in finance, MonetaGo is making working capital easier to access for millions of underserved businesses.
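MonetaGo’s exact fingerprinting scheme isn’t public, but the core idea can be sketched: canonicalize the key fields of a financing document and hash them, so institutions can compare fingerprints without ever exchanging the documents themselves. Everything below, from the field names to the use of plain SHA-256, is our simplified assumption, not MonetaGo’s implementation.

```c
/* Simplified illustration of document fingerprinting for duplicate
 * detection: canonicalize key fields of a financing document, hash them,
 * and use the digest as a comparable, non-reversible fingerprint.
 * Build with: cc fingerprint.c -lcrypto */
#include <stdio.h>
#include <string.h>
#include <ctype.h>
#include <openssl/sha.h>

/* Lowercase and strip whitespace so trivial formatting differences
 * don't change the fingerprint. */
static void canonicalize(const char *in, char *out, size_t outsz)
{
    size_t j = 0;
    for (size_t i = 0; in[i] && j + 1 < outsz; i++)
        if (!isspace((unsigned char)in[i]))
            out[j++] = (char)tolower((unsigned char)in[i]);
    out[j] = '\0';
}

int main(void)
{
    /* Hypothetical key fields of an invoice offered as collateral. */
    const char *fields[] = {
        "INV-2025-00731",    /* invoice number */
        "US-94-1234567",     /* issuer tax ID */
        "250000.00 USD",     /* amount */
    };

    char canon[256] = "", field[128];
    for (size_t i = 0; i < sizeof(fields) / sizeof(fields[0]); i++) {
        canonicalize(fields[i], field, sizeof(field));
        strncat(canon, field, sizeof(canon) - strlen(canon) - 1);
        strncat(canon, "|", sizeof(canon) - strlen(canon) - 1);
    }

    /* The SHA-256 digest is the fingerprint lenders compare. */
    unsigned char fp[SHA256_DIGEST_LENGTH];
    SHA256((const unsigned char *)canon, strlen(canon), fp);

    printf("fingerprint: ");
    for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
        printf("%02x", fp[i]);
    putchar('\n');
    return 0;
}
```

One caveat worth noting: a plain hash of low-entropy fields can be brute-forced, so a production system would use salted or keyed hashing, with the secret held inside the trusted execution environment.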
Innovate with AMD SEV and our open ecosystem for confidential AI
We’re only just getting started with what’s possible with confidential computing. AMD SEV empowers organizations of all kinds to innovate by safely processing private or regulated data and protecting AI intellectual property. So much of what we can do with the combined power of AI and zettabytes of data would not be achievable without this kind of multi-party, confidential data sharing.
At AMD, we’re working with an entire industry of technology providers to bring confidential AI to new use cases. In 2023, we published the AMD SEV firmware source code to give the ecosystem assurance through transparency.
We believe that trusted I/O is the next step in confidential computing, and we’re supporting the TEE Device Interface Security Protocol (TDISP) on 5th Gen AMD EPYC™ server CPUs. TDISP helps establish trust between a trusted execution environment and the GPUs and other devices attached to it, so those devices can exchange data with the TEE directly rather than through unprotected intermediaries, which has the potential to improve I/O performance. AMD SEV is the foundation for confidential AI across cloud, hardware, and software providers, and now it will also support TDISP endpoints. We can’t wait to see what the ecosystem does next to unlock the potential of AI.
If you’d like to learn more about AMD SEV and explore ways to get started, please visit amd.com/confidentialcomputing.
Footnotes
¹ EPYC-056: Confidential Computing on EPYC is enabled by the SEV security feature, which was introduced with 1st Generation EPYC in 2017. 2nd Gen EPYC powered the first confidential computing cloud instance in Google Cloud in 2020. EPYC powers the highest number of confidential VM options available on all major CSPs; supports both host and guest in the Linux kernel; is available on all major Linux distributions; has support on VMware; and supports confidential containers.