Strengthening Multi-Cloud Security: The Role of COCONUT-SVSM in Confidential Virtual Machines


By Sal Kimmich

Introduction:

As businesses increasingly adopt multi-cloud environments to run their critical workloads, ensuring data security and compliance with regional privacy regulations becomes paramount. The proliferation of sensitive workloads across different cloud providers raises concerns about the safety of data, particularly in virtualized environments where virtual machines (VMs) handle vast amounts of personal and regulated data.

This is where COCONUT-SVSM (Secure Virtual Machine Service Module) shines. Designed to provide secure services and device emulations for confidential virtual machines (CVMs), COCONUT-SVSM ensures that sensitive workloads remain secure, even in distributed or potentially untrusted cloud environments. In this blog, we will explore the value of COCONUT-SVSM in safeguarding virtualized workloads, highlighting how it strengthens multi-cloud security.

Why Secure Virtual Machines Matter in Multi-Cloud Environments

Virtual machines (VMs) are a critical part of the modern cloud infrastructure, enabling organizations to efficiently allocate resources and scale their operations. However, traditional VMs are vulnerable to attacks from both external threats and privileged insiders, especially when data is processed in the cloud.

In multi-cloud environments, workloads can span multiple cloud providers, making it difficult to ensure that each environment is secure. This is where confidential computing and technologies like COCONUT-SVSM come into play. By creating confidential virtual machines (CVMs), organizations can isolate sensitive workloads from the underlying host operating system, ensuring that data remains protected, even if the host is compromised.

The Architecture of COCONUT-SVSM: Providing Security for Confidential VMs

At the heart of COCONUT-SVSM is its ability to provide secure services to CVMs through device emulations and remote attestation. These features enable organizations to run sensitive workloads with the assurance that both the data and the virtual machine environment are secure from unauthorized access.

Key features of COCONUT-SVSM include:

  • TPM Emulation: Emulating a Trusted Platform Module (TPM), COCONUT-SVSM enables secure key management and encryption within the virtual machine.
  • Secure Boot: Using UEFI variable storage, COCONUT-SVSM ensures that VMs can only boot in secure environments, preventing malicious actors from modifying the boot process.
  • Live Migration Support: In multi-cloud environments, VMs often need to be moved between physical hosts. COCONUT-SVSM supports secure live migration, ensuring that sensitive data remains protected during transitions.

These features help organizations comply with strict data privacy regulations, such as GDPR and CCPA, by maintaining control over how and where sensitive data is processed.
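
To make the measured-boot mechanics behind TPM emulation and secure boot concrete, here is a minimal sketch of the PCR-extend operation a TPM (virtual or physical) performs as each boot component is measured. The stage names are illustrative, and a real vTPM exposes this through the TPM 2.0 command interface rather than a Python call.

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM 2.0 extend semantics: new PCR value = H(old PCR || measurement)
    return hashlib.sha256(pcr + measurement).digest()

# PCRs start at all zeroes; each boot stage extends the register in turn.
pcr0 = bytes(32)
for stage in [b"firmware image", b"bootloader", b"kernel"]:  # illustrative stages
    pcr0 = pcr_extend(pcr0, hashlib.sha256(stage).digest())

print(pcr0.hex())  # tampering with any stage changes the final value
```

Because the final register value depends on every measurement and their order, a verifier that knows the expected values can detect any modification to the boot chain.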

How COCONUT-SVSM Enhances Compliance in Multi-Cloud Systems

Compliance with data sovereignty and privacy regulations is a major challenge for organizations operating across multiple jurisdictions. For example, regulations like GDPR mandate that personal data is processed and stored within specific geographic boundaries, while ensuring that security controls are in place to prevent unauthorized access.

COCONUT-SVSM enhances compliance by ensuring that data processed in confidential virtual machines is always secured, regardless of where the data is physically located. This is particularly important for businesses with operations in multiple regions, as it allows them to securely process sensitive workloads while adhering to local regulations.

Additionally, remote attestation provided by COCONUT-SVSM ensures that workloads are only processed in trusted environments, providing an additional layer of security for organizations handling sensitive data.
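
As a rough sketch of what remote attestation involves, the flow below signs a workload measurement with a platform-held key and lets a relying party verify it against a known-good reference value. Real attestation uses vendor-specific report formats and certificate chains; the Ed25519 keypair here merely stands in for the hardware root of trust.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in for the hardware attestation key (held by the platform in reality).
attestation_key = Ed25519PrivateKey.generate()

# "Evidence": a hash of the launched workload, signed by the platform.
measurement = hashlib.sha256(b"confidential workload image").digest()
signature = attestation_key.sign(measurement)

# The verifier checks the signature, then compares against a reference value.
attestation_key.public_key().verify(signature, measurement)  # raises if forged
expected = hashlib.sha256(b"confidential workload image").digest()
assert measurement == expected, "workload does not match reference value"
print("attestation verified: workload is running in a trusted environment")
```

Only after both checks pass should a relying party hand secrets to the environment, which is exactly the assurance compliance teams need.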

Real-World Applications: COCONUT-SVSM in Healthcare and Finance

The healthcare and finance sectors are two prime examples of industries that can benefit from the enhanced security provided by COCONUT-SVSM. Both industries handle vast amounts of personal and financial data, making security and compliance critical to their operations.

  • Healthcare: In healthcare, COCONUT-SVSM can be used to protect sensitive patient data during AI-driven diagnostics or clinical trials. By creating secure environments for processing healthcare data, COCONUT-SVSM helps healthcare providers comply with regulations like HIPAA while ensuring that patient privacy is maintained.
  • Finance: In the financial sector, COCONUT-SVSM can be used to secure fraud detection models or other sensitive financial operations. By protecting virtual machines used to process financial transactions, COCONUT-SVSM helps financial institutions comply with PCI-DSS standards and other financial regulations.

COCONUT-SVSM as a Pillar of Multi-Cloud Security

As organizations continue to embrace multi-cloud strategies, the importance of securing virtualized environments cannot be overstated. COCONUT-SVSM provides the tools needed to ensure that confidential virtual machines (CVMs) remain secure and compliant, even when workloads are distributed across multiple cloud providers.

By leveraging features like TPM emulation, secure boot, and remote attestation, COCONUT-SVSM enables organizations to maintain control over their data and adhere to data sovereignty regulations, making it an essential part of any confidential computing strategy. As industries like healthcare and finance continue to handle sensitive data, COCONUT-SVSM will play a critical role in protecting workloads and ensuring compliance in multi-cloud environments.


Post-Quantum Cryptography: Preparing for a Quantum Future


Author:  Sal Kimmich

In a recent presentation to the Confidential Computing Consortium’s Technical Advisory Committee, Hart Montgomery discussed the pressing topic of post-quantum cryptography (PQC). The presentation highlighted the looming threat posed by quantum computers to traditional public key cryptography and outlined the proactive steps necessary to secure digital information in a post-quantum world.

The Quantum Threat

Montgomery began by addressing the fundamental issue: quantum computers, once sufficiently powerful, will be able to break nearly all widely deployed public key cryptography methods, including standards like RSA, DSA, and elliptic curve cryptography (such as ECDSA). The crux of the problem is that quantum computers can solve complex mathematical problems—such as factoring large numbers and the discrete logarithm problem—exponentially faster than classical computers, rendering current cryptographic techniques vulnerable.

  • RSA (Rivest–Shamir–Adleman): A widely used public-key cryptosystem that relies on the difficulty of factoring large integers. Learn more about RSA.
  • DSA (Digital Signature Algorithm): A Federal Information Processing Standard for digital signatures, based on the difficulty of solving discrete logarithms. Learn more about DSA.
  • ECDSA (Elliptic Curve Digital Signature Algorithm): A cryptographic algorithm used by many standards for digital signatures that relies on the hardness of discrete logarithm over elliptic curves. Learn more about ECDSA.

Why Does This Matter?

The implications of quantum computers’ ability to break these cryptographic methods are far-reaching. A particularly concerning scenario is the “harvest now, decrypt later” problem, where adversaries could intercept and store encrypted data today, only to decrypt it in the future when quantum computing is sufficiently advanced. This is especially problematic for sectors like finance, where regulations often require data to be secure for decades. All experts queried for the Global Risk Institute’s 2023 Quantum Threat report agreed that this shift is likely to occur within the next three decades.

The Power of Quantum Computing

To better understand the quantum threat, Montgomery provided a brief overview of quantum computing’s capabilities. Quantum computers operate using quantum bits, or qubits, which can exist in a superposition of states, allowing for massive parallelism in some computations. This property enables quantum algorithms, such as Shor’s algorithm, to solve problems like integer factorization exponentially faster than classical algorithms.

Shor’s algorithm, in particular, presents a significant threat to cryptography. It can factor large numbers exponentially faster than the best-known classical algorithms, such as the General Number Field Sieve (GNFS). For example, while classical algorithms might take an impractically long time to factor a 1,000-digit number, a quantum computer running Shor’s algorithm could potentially do so in a feasible amount of time.
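
A toy example makes the dependence on factoring concrete. The sketch below builds an RSA-style key from two absurdly small primes and then breaks it by trial division; Shor’s algorithm would achieve the effect of the `factor` step in polynomial time even for 2048-bit moduli.

```python
# Toy RSA with tiny primes -- purely illustrative, never use such parameters.
p, q = 61, 53
n, e = p * q, 17                       # public key: (n, e)
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent, known only to the owner

ciphertext = pow(42, e, n)             # encrypt the message 42

def factor(m: int) -> tuple[int, int]:
    """Classical trial division: runtime explodes with the bit length of m."""
    f = next(i for i in range(2, m) if m % i == 0)
    return f, m // f

# An attacker who can factor n recovers the private key outright.
p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(ciphertext, d2, n) == 42    # message recovered without the owner's key
```

For real key sizes, `factor` is hopeless on classical hardware, and that asymmetry is precisely what quantum computers are expected to erase.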

  • Quantum Superposition: A fundamental principle of quantum mechanics where a quantum system can exist in multiple states simultaneously. Learn more about superposition.
  • Shor’s Algorithm: A quantum algorithm that can efficiently factorize large integers, threatening current public-key cryptographic systems. Learn more about Shor’s Algorithm.
  • General Number Field Sieve (GNFS): The most efficient classical algorithm for factoring large integers. Learn more about GNFS.

Quantum-Safe Cryptography

To counter the quantum threat, the cryptographic community has been developing quantum-safe cryptographic algorithms. These new methods are based on mathematical problems that are believed to be resistant to quantum attacks. One of the leading approaches is lattice-based cryptography, which involves complex mathematical structures known as lattices.

Montgomery emphasized the importance of transitioning to quantum-safe cryptography well before quantum computers reach a stage where they can break existing cryptographic systems. The timeline for the advent of quantum computers remains uncertain, with experts estimating that powerful quantum computers could emerge within the next 15 to 30 years. For organizations that need to secure data for extended periods, the shift to quantum-safe methods is urgent.

Standardization Efforts and Challenges

Montgomery highlighted the extensive efforts to standardize post-quantum cryptography. The National Institute of Standards and Technology (NIST) has been leading a global initiative to develop and evaluate quantum-safe algorithms. This process has involved rigorous review and testing by cryptographers worldwide. The first set of standardized algorithms was released in August 2024, with three algorithms selected: Kyber, Dilithium, and Sphincs+ (a fourth candidate, Falcon, is still being standardized).

While these algorithms offer security against quantum attacks, they also introduce challenges. One significant issue is the larger key sizes and computational overhead associated with these new methods. For example, lattice-based schemes like Kyber and Dilithium require larger keys and ciphertexts, which could impact performance in certain applications, particularly those involving large-scale or high-frequency cryptographic operations.
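
To see why this overhead matters, compare rough object sizes for today’s elliptic-curve primitives with the new lattice-based schemes. The figures below are approximate, drawn from the published parameter sets; consult FIPS 203/204 for authoritative numbers.

```python
# Approximate sizes in bytes; see FIPS 203/204 for exact figures.
sizes = {
    "X25519 public key":       32,
    "Kyber-768 public key":  1184,
    "Kyber-768 ciphertext":  1088,
    "Ed25519 signature":       64,
    "Dilithium2 signature":  2420,
}
for name, nbytes in sizes.items():
    print(f"{name:>22}: {nbytes:5d} bytes")
```

A handshake performing one key exchange and one signature thus grows from on the order of a hundred bytes of asymmetric material to several kilobytes.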

  • NIST (National Institute of Standards and Technology): A U.S. federal agency that develops and promotes measurement standards, including cryptographic standards. Learn more about NIST.
  • Kyber: A lattice-based key encapsulation mechanism (KEM) designed for post-quantum security. Learn more about Kyber.
  • Dilithium: A lattice-based digital signature algorithm designed for post-quantum security. Learn more about Dilithium.
  • Falcon: A compact lattice-based digital signature scheme optimized for post-quantum security. Learn more about Falcon. (Standard still in development.)
  • Sphincs+: A stateless hash-based digital signature scheme that provides post-quantum security. Learn more about Sphincs+.

Impact on Confidential Computing

The discussion also touched on the implications for confidential computing, particularly in areas like attestation, which heavily relies on cryptographic methods. Attestation is a critical component in confidential computing, used to verify the integrity and authenticity of a system or software environment. 

Montgomery noted that while the transition to post-quantum cryptography will require careful planning, many aspects of confidential computing, such as firmware and microcode, may not require significant hardware changes to implement quantum-safe cryptographic algorithms.

However, he did caution that the larger key sizes and ciphertexts associated with post-quantum cryptography could pose challenges in scenarios where numerous attestations or key exchanges occur frequently. Despite these challenges, the transition is crucial to ensure the long-term security of confidential computing environments.
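
A back-of-the-envelope calculation shows where this bites. Swapping a 64-byte Ed25519 signature for a roughly 2.4 KB Dilithium2 signature multiplies signature traffic by nearly 40x at the same attestation rate (the load figure below is illustrative).

```python
ED25519_SIG, DILITHIUM2_SIG = 64, 2420     # bytes; Dilithium2 size approximate
attestations_per_second = 10_000           # illustrative service load

for name, sig_bytes in [("Ed25519", ED25519_SIG), ("Dilithium2", DILITHIUM2_SIG)]:
    mb_per_s = sig_bytes * attestations_per_second / 1e6
    print(f"{name:>10}: {mb_per_s:6.2f} MB/s of signature traffic")
# Ed25519: 0.64 MB/s versus Dilithium2: 24.20 MB/s for the same workload.
```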

The Post-Quantum Cryptography Alliance

To further advance the adoption of quantum-safe cryptography, Montgomery introduced the Post-Quantum Cryptography Alliance. The alliance’s goal is to build high-quality, quantum-safe cryptographic code and foster collaboration between the research community and developers to refine cryptographic algorithms that are resistant to quantum attacks. The alliance is structured similarly to other Linux Foundation projects, with an emphasis on open collaboration and transparency. Two key projects within the alliance are the Open Quantum Safe (OQS) project and the PQ Code Package project. OQS focuses on the development and implementation of quantum-safe algorithms, while the PQ Code Package project is dedicated to creating formally verified, high-assurance implementations of quantum-safe standards like Kyber.

Looking towards the Quantum Computing Era 

Post-Quantum Cryptography (PQC) addresses the quantum threat by developing cryptographic algorithms that can withstand attacks from quantum computers, ensuring that encrypted data remains secure and that signatures cannot be forged. Meanwhile, Confidential Computing (CC) protects data in use through secure enclaves and hardware-based security features, safeguarding sensitive computations from unauthorized access. 

Together, PQC and CC provide a layered security approach that covers the entire data lifecycle—from protection at rest and in transit to safeguarding data during processing. As digital threats evolve, integrating both PQC and CC into security strategies is vital for organizations looking to future-proof their operations. These technologies are not just essential on their own; they complement each other, forming the foundation of tomorrow’s secure computing environment.

As we approach the era of quantum computing, the need for quantum-safe cryptography becomes increasingly urgent. Hart Montgomery’s presentation underscored the importance of proactive measures, including the development and standardization of post-quantum cryptographic methods. While challenges remain—such as increased computational overhead and larger key sizes—the work being done today will be crucial in securing our digital future against the quantum threat.

You can watch the entire discussion on the CCC YouTube channel.

Exploring Enclave SDKs: Enhancing Confidential Computing


Author:  Sal Kimmich

 

In the realm of confidential computing, enclave SDKs play a pivotal role in ensuring secure and private execution environments. These software development kits provide developers with the necessary tools and frameworks to build, deploy, and manage applications that operate within enclaves. In this blog, we will explore three prominent open-source enclave SDKs: Open Enclave, Keystone, and Veracruz. Additionally, we will touch upon the Certifier Framework, which, while slightly different, contributes significantly to the landscape of confidential computing.

Open Enclave

Open Enclave is a versatile SDK that provides a unified API surface for creating enclaves on various Trusted Execution Environments (TEEs) such as Intel SGX and ARM TrustZone. Developed and maintained by a broad community, Open Enclave aims to simplify the development of secure applications by offering a consistent and portable interface across different hardware platforms.

Key Features of Open Enclave:

  • Cross-Platform Support: One of the standout features of Open Enclave is its ability to support multiple hardware architectures, making it a flexible choice for developers working in diverse environments.
  • Rich Documentation and Community Support: Open Enclave boasts extensive documentation and a supportive community, providing ample resources for developers to learn and troubleshoot.
  • Comprehensive Security Measures: The SDK incorporates robust security features, including memory encryption, attestation, and secure storage, ensuring that applications remain secure and tamper-resistant.
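
The host/enclave split these SDKs expose looks roughly like the sketch below. All names are hypothetical Python stand-ins: Open Enclave itself provides a C API in which an untrusted host creates the enclave, makes defined entry calls (ECALLs) into it, and finally tears it down.

```python
class Enclave:
    """Hypothetical stand-in for a hardware-isolated enclave."""

    def __init__(self, image: str):
        self._secrets: dict = {}      # lives in protected memory in a real TEE
        print(f"enclave created from measured image {image!r}")

    def ecall_process(self, payload: bytes) -> bytes:
        # Work on sensitive data happens only inside the enclave boundary.
        return payload[::-1]          # placeholder computation

    def terminate(self) -> None:
        self._secrets.clear()         # secrets never leave the enclave

# Host side: untrusted code that can invoke, but never inspect, the enclave.
enclave = Enclave("app_enclave.signed")
result = enclave.ecall_process(b"sensitive payload")
enclave.terminate()
```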

Keystone

Keystone is an open-source framework designed to provide secure enclaves on RISC-V architecture. It is highly modular and customizable, allowing developers to tailor the security features to meet the specific needs of their applications.

Key Features of Keystone:

  • Modularity: Keystone’s design philosophy revolves around modularity, enabling developers to customize the enclave’s components, such as the security monitor, runtime, and drivers.
  • RISC-V Architecture: Keystone is built specifically for the RISC-V architecture, leveraging its open and extensible nature to offer a unique and highly configurable enclave solution.
  • Research and Innovation: Keystone is often used in academic and research settings, driving innovation in the field of confidential computing and providing a platform for experimental security enhancements.

Veracruz

Veracruz is an open-source project that aims to create a collaborative computing environment where multiple parties can jointly compute over shared data without compromising privacy. It emphasizes data confidentiality and integrity, making it ideal for scenarios involving sensitive data.

Key Features of Veracruz:

  • Collaborative Computing: Veracruz enables secure multi-party computation, allowing different stakeholders to collaborate on computations without revealing their individual data.
  • Privacy-Preserving: The framework ensures that data remains confidential throughout the computation process, leveraging TEEs to provide strong privacy guarantees.
  • Flexible Deployment: Veracruz supports various deployment models, including cloud, edge, and on-premises, making it adaptable to different use cases and environments.
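
To illustrate the collaborative-computing idea (this is not Veracruz’s actual API), the sketch below has two parties contribute private inputs to a joint computation from which only the agreed aggregate emerges. In Veracruz, that isolation is enforced by a TEE and a policy document rather than by Python scoping.

```python
def collaborative_mean(*private_inputs: list[float]) -> float:
    """Hypothetical enclave computation: raw inputs stay inside the trusted
    boundary; only the agreed aggregate result is released."""
    values = [v for party in private_inputs for v in party]
    return sum(values) / len(values)

hospital_a = [4.2, 5.1, 3.9]   # never revealed to hospital B
hospital_b = [6.0, 4.8]        # never revealed to hospital A

joint = collaborative_mean(hospital_a, hospital_b)
print(f"only the joint result leaves the enclave: {joint:.2f}")
```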

Certifier Framework: A Slightly Different Approach

While the Certifier Framework for Confidential Computing shares the goal of enhancing security and privacy in computational environments, it adopts a distinct approach compared to traditional enclave SDKs.

Certifier Framework focuses on providing a unified certification and attestation infrastructure for confidential computing environments. It aims to ensure that the software and hardware components in a system can be securely attested and certified, providing trust guarantees to end-users and applications.

Key Features of the Certifier Framework:

  • Certification and Attestation: The primary focus of the Certifier Framework is on certification and attestation, ensuring that all components of a confidential computing environment meet stringent security standards.
  • Unified Approach: The framework offers a unified approach to certification across different TEEs, simplifying the process of establishing trust in diverse environments.
  • Integration with Existing Solutions: The Certifier Framework can be integrated with other enclave SDKs and confidential computing solutions, enhancing their security posture through robust certification mechanisms.

Conclusion

Enclave SDKs like Open Enclave, Keystone, and Veracruz are critical tools for developers aiming to build secure and private applications in the realm of confidential computing. Each of these projects brings unique strengths and features to the table, catering to different hardware architectures and use cases. Meanwhile, the Certifier Framework provides an essential layer of trust and certification, complementing these SDKs and ensuring that confidential computing environments meet the highest security standards. By leveraging these powerful tools, developers can create innovative solutions that protect sensitive data and maintain user privacy in an increasingly digital world.


Library OS for Confidential Computing: Enhancing Data Security with Cutting-Edge Projects


Author:  Sal Kimmich

Introduction

As the landscape of data security continues to evolve, the concept of a Library OS (operating system) for Confidential Computing is gaining traction. Library OS projects create secure environments for applications by providing “auto” enclaves for process isolation. These enclaves, also known as runtimes or sandboxes, ensure that sensitive data remains protected even during processing. In this blog, we explore the significance of Library OS for confidential computing and highlight three key projects: Gramine, Occlum, and Enarx.

What is a Library OS?

A Library OS, or “libOS,” is a streamlined operating system that runs applications within secure enclaves. These enclaves isolate processes, providing a trusted execution environment (TEE) that safeguards data from unauthorized access and tampering. This approach is particularly valuable for confidential computing, where data must remain secure throughout its lifecycle, including during computation.

Key Projects in Library OS for Confidential Computing

Gramine
  • Overview: Gramine is an open-source Library OS designed to run applications in trusted execution environments. It supports Intel SGX and enables the secure execution of unmodified applications.
  • Features: Gramine provides robust security by isolating applications within enclaves, ensuring that data remains protected even if the underlying host is compromised. Its compatibility with existing applications makes it a versatile choice for enhancing data security.
  • GitHub: Gramine Project
Occlum
  • Overview: Occlum is a memory-safe, multi-process Library OS that supports Intel SGX. It aims to provide a secure and efficient environment for running applications within enclaves.
  • Features: Occlum ensures data confidentiality and integrity by isolating processes and providing strong security guarantees. Its design focuses on performance and scalability, making it suitable for a wide range of applications.
  • GitHub: Occlum Project
Enarx
  • Overview: While not a traditional Library OS, Enarx uses WebAssembly (Wasm) to provide similar benefits. It enables the secure execution of applications in TEEs, ensuring data privacy and integrity.
  • Features: Enarx leverages Wasm to create secure runtimes that can run across different hardware platforms. Its approach simplifies the deployment of secure applications, making it a compelling option for confidential computing.
  • GitHub: Enarx Project

The Importance of Library OS in Confidential Computing

Library OS projects like Gramine, Occlum, and Enarx play a crucial role in the realm of confidential computing. They offer a layer of security that ensures sensitive data remains protected during processing. By isolating applications within secure enclaves, these projects mitigate risks associated with data breaches and unauthorized access.

Conclusion

The concept of a Library OS for confidential computing represents a significant advancement in data security. Projects like Gramine, Occlum, and Enarx demonstrate the potential of this approach to enhance privacy and protect sensitive information. As the need for secure data processing continues to grow, these projects will play an increasingly vital role in ensuring the confidentiality and integrity of data in various applications.

Stay tuned for more insights into the world of confidential computing and the innovative projects that are driving this field forward.

Partisia Joins the Confidential Computing Consortium as a Start-up Tier Member


We are pleased to welcome Partisia, a global pioneer in Multiparty Computation (MPC) and advanced cryptographic privacy, as a Start-up Tier member of the Confidential Computing Consortium (CCC). Their membership strengthens the CCC’s efforts to advance secure, privacy-preserving computing by bringing Partisia’s expertise in cutting-edge cryptographic solutions to the forefront of our initiatives.

Founded in 2008, Partisia has a long history of delivering commercial-grade MPC software solutions, with an initial focus on secure, high-stakes auctions used for trading energy and spectrum licenses. Over the years, Partisia’s MPC solutions have evolved, becoming the foundation for various services, including key management, data activation, statistics, and bespoke applications such as DeFi, voting, and e-cash.

Partisia’s commercial activities have also led to the creation of successful spinouts, such as Sepior, which was acquired by Blockdaemon in 2022, and the Partisia Blockchain Foundation. This Swiss-based foundation governs and launches a public blockchain built by Partisia.

By joining the Confidential Computing Consortium, Partisia aligns itself with a global community dedicated to defining and accelerating the adoption of confidential computing. This membership further solidifies Partisia’s commitment to addressing weak and single points of failure across digital infrastructures through commercializing advanced cryptographic technologies.

We eagerly anticipate the valuable contributions that Partisia will bring to the CCC and the broader tech community. As they continue to push the boundaries of secure, privacy-preserving computing, we are excited to see the innovative solutions they will develop.


August Newsletter


In Today’s Issue:

  1. Executive Director August Recap
  2. Agenda Released! CC Mini Summit @ OSSEU
  3. Post-Quantum Cryptography
  4. Web3 Use Case
  5. Community Blog Highlights

Welcome to the August edition of our newsletter – your guide to awesome happenings in our CCC community. Let’s go!

Executive Director August recap

While it’s holiday season in much of the Northern Hemisphere, the CCC’s work continues (uninterrupted even by the Olympics and Paralympics!), and as we’ve grown over the past few years, we’ve made the decision to continue Governing Board meetings throughout the year, instead of breaking for the (Northern) summer period.  The Governing Board manages the strategic and policy directions of the CCC, including budgetary decisions and the acceptance of new open-source projects into the Consortium.  Attendance is open to officers of the Consortium, Premier Member representatives, and the elected Governing Board representatives of the General Members.  Representatives from other committees typically attend and present the status of work in their respective areas and sometimes the Governing Board requests reports from other groups.

While keeping within the governance structure of the Consortium, we try to maintain a “minimal viable governance” approach.  Post-Covid (and with changing travel budget constraints for many organizations), opportunities to meet in person have been reduced, so we are considering a face-to-face meeting (supplemented by video conferencing) at the Linux Foundation Member Summit in November: please let us know if you’re going to be there (even if you’re not a Premier member!).

One of the areas that the Governing Board has been keen to promote work on this year has been lowering barriers to the adoption of Confidential Computing.  One of these is the availability of Attestation Verification Services, which allow consumers of Confidential Computing services to gain the cryptographic assurances they need about their workloads.  Attestation is a core part of Confidential Computing, and the word “attested” was deliberately added to the CCC’s definition of Confidential Computing to reflect that:
“Confidential Computing is the protection of data in use by performing computation in a hardware-based, attested Trusted Execution Environment.”

The CCC has recently kicked off a piece of work to encourage discussion of business models around Attestation Verification Services and to help those considering providing or consuming them.  An initial discussion document has generated a great deal of input and the plan is to start a working group with online meetings later in August.  If you are interested in participating, please get in touch.

CC Mini Summit Agenda Announced!

Bringing EU Community Together

CCC is hosting the “Confidential Computing Mini Summit” at the Open Source Summit EU in Vienna, Austria

  • 📢 Mini Summit Agenda
  • ⏰ Time: 13:30 – 17:00
  • 📍 Room 0.14 (level 0) – see floor plan here
  • 🎫 Mini Summit Registration Fee: $10
  • 💰 20% Discount Code for Main Summit: OSSEUCOLOSPK20
    (*Note: Registration for the main conference is required to attend the Mini Summit.)
  • Register Here

Post-Quantum Cryptography

Over the last few weeks at TAC meetings, we’ve been discussing a new evolution of cryptography called Post-Quantum Cryptography, or PQC. As full-scale quantum computers become more and more likely, cryptographers have had to invent new algorithms that will remain secure against adversaries with new capabilities. In Confidential Computing, we rely on cryptography in a number of ways to protect workloads in use. As a trusted execution environment (TEE) starts, we use cryptographic hash algorithms to fingerprint each component.

Later we use cryptographic signatures when the hardware attests to those measurements. While the workload is running, the memory is protected with encryption and, in some cases, integrity provisions. Some of these algorithms are more impacted by quantum computing than others. Hardware vendors will need to update their algorithms. Software vendors may want to shield downstream adopters by carefully designing their APIs. If you are interested in learning more, keep your eyes open for an upcoming blog on our Post-Quantum Cryptography discussions or watch our Tech Talk.

TAC Tech Talk playlist 


Web3 Use Case

Enabling Verifiable, User-Owned and Tradable AI Agents in Games – with Veriplay, Polygon, Immutable and Super Protocol

True Web3 Games, with their potential for rich gaming experiences, advanced AI agents, and genuine digital asset ownership, can only reach their full potential through the implementation of Confidential Computing in a truly decentralized manner. The Confidential Computing Consortium, alongside its member Super Protocol, is at the forefront of this revolution, demonstrating how these technologies can unlock new business opportunities.

Read the Full Use Case

Community Blog Highlights

July Newsletter


In Today’s Issue:

  1. Executive Director July Recap
  2. The Case for Confidential Computing
  3. Community News
  4. OSS EU 2024, Confidential Computing Mini Summit

Welcome to the July edition of our newsletter – your guide to awesome happenings in our CCC community. Let’s go!

Executive Director July recap

Following the announcement of a 12-month free subscription to the CCC for new members with under 100 employees, we’ve had a steady stream of new members, and it’s continuously growing! If you are a start-up and would like to get involved in the CCC’s work (or you know another organization that might be interested), please get in touch. You can find information about many of the benefits on our website.

This month, I went back to Asia, meeting members (and potential members) in South Korea and Singapore. The CCC sponsored the Privacy-Enhancing Technology Summit Asia-Pacific again this year and we had a fantastic turnout. Read the full recap blog here.

Having had the CC Summit in North America and the PET Summit in Singapore, we’re not about to leave out Europe, where we’re seeing increasing interest and traction for Confidential Computing. I recently led a panel discussion on CC for the European Central Bank with Parviz Peiravi from Intel and Felix Schuster from Edgeless Systems. And we’re also running a CC Mini-Summit at Open Source Summit in Vienna on 19th September. No waltzes are promised, but there are opportunities to speak: still a few more days to submit your talk! Mini Summit CFP

CCC’s Use Case Report is LIVE

As the collection, storage, and analysis of data become increasingly important across industries, businesses are looking for solutions that keep data secure and processes compliant with regulations. Confidential computing is one of these solutions, involving the use of a trusted execution environment that runs on shared infrastructure but processes data away from unauthorized users.

For this use case report, we interviewed members of the confidential computing community about the ways they have implemented the technology and what they believe its future holds.

Read the Full Report

Community News

Meet us at Open Source Summit

Bringing EU Community Together

CCC is hosting the “Confidential Computing Mini Summit” at the Open Source Summit EU in Vienna, Austria

  • ⏰ Time: 13:30 – 17:00
  • 🎫 Mini Summit Registration Fee: $10
  • 💰 20% Discount Code for Main Summit: OSSEUCOLOSPK20
    (*Note: Registration for the main conference is required to attend the Mini Summit.)
  • Register Here

Enabling Verifiable, User-Owned and Tradable AI Agents in Games – with Veriplay, Polygon, Immutable and Super Protocol


Author:  Nukri Basharuli, Founder and CEO, Super Protocol

 

 

True Web3 Games, with their potential for rich gaming experiences, advanced AI agents, and genuine digital asset ownership, can only reach their full potential through the implementation of Confidential Computing in a truly decentralized manner. The Confidential Computing Consortium, alongside its member Super Protocol, is at the forefront of this revolution, demonstrating how these technologies can unlock new business opportunities.

Super Protocol serves a dual role in the evolving digital landscape. As a confidential and self-sovereign AI Cloud, it focuses on decentralization, privacy, and verifiability. Its computing network of commonly adopted GPU and CPU types operates in a confidential mode under the orchestration of Smart Contracts on the Polygon blockchain. This makes Super a decentralized alternative to centralized clouds like Amazon AWS for Web3 AI projects. Additionally, as an AI Marketplace, Super Protocol differs from traditional AI marketplaces like Hugging Face by offering AI model and data owners the unique ability to share and monetize their assets in a fully confidential, self-sovereign mode. The value of Super Protocol is well illustrated by the examples of its clients.

In this blog, we’ll explore how Super Protocol’s AI cloud is set to transform the gaming industry, enabling a secure and self-sovereign experience that could redefine the future of digital entertainment.

Example: Veriplay 

About Veriplay 

Veriplay is a startup that is developing a gaming platform compatible with Immutable and Polygon. This platform will enable creating AI agents in Web3 games that can be traded on the open market as dynamic NFTs.

The Veriplay team, with a proven track record of working with industry giants like Playrix, Warner Brothers, Google, and Crytek, is on a mission to revolutionize the gaming landscape by introducing verifiable and tradable AI agents.

This innovative approach aims to address the limitations of traditional gaming experiences and empower players with unprecedented control over their in-game assets.

Super Protocol and Veriplay research and testing efforts have utilized NVIDIA’s H100 GPUs, provided by the NVIDIA Confidential Computing team (more details in the Super Protocol press release: https://www.linkedin.com/posts/superprotocol_confidentialcomputing-depin-nvidiainception-activity-7169336537371914242-LBV3?utm_source=share&utm_medium=member_desktop).

Project Goal 

Veriplay’s goal is to give players the ability to truly own AI game agents. To do so, it aims to develop a reliable, Web3-compatible gaming platform for integrating verifiable tokenized AI agents into games with the following characteristics: 

  • Player AI Models are Protected from Unauthorized Alterations: Veriplay wants to protect player AI models from any unauthorized modifications, whether initiated by the game developer or external malicious actors. 
  • AI Model Training is Verifiable: This means that it is possible to verify how the AI agents were trained, which guarantees their fairness and transparency. 
  • Decentralization (Smart Contract Orchestration): Smart contracts will govern the execution of AI computations and data storage, ensuring transparency and immutability, and eliminating the human administration layer.
  • Free trading of AI NFTs on marketplaces: Veriplay revolutionizes AI agent ownership by transforming them into tradable digital assets managed by players through dynamic NFTs.

It is evident that without AI computation privacy, verifiability of AI model training history, and smart contract management, the integrity of AI agents as digital assets will be irrevocably compromised, shattering market trust and leading to market rejection of such assets. 

The Centralized Infrastructure Problem 

There are several problems with creating trusted AI agents in centralized infrastructure, such as Amazon Web Services (AWS) or Google Cloud: 

  • Difficulty of Verification: It is difficult to verify that AI agents have been trained and operate according to the rules of the game declared by the developer. This is especially important when AI agents become tradable assets or when they are used in competitions with prize money. 
  • Risk of Developer Manipulation: Developers have the ability to alter or duplicate an AI agent trained by a player who has invested time and money into the training process. For instance, a developer could duplicate a successful model that frequently wins competitions and sell it to other players as this developer’s original creation. 
  • Player’s Inability to Own Agents: In centralized AI agent infrastructures, players lack true ownership of their agents, being confined to developer-defined capabilities and pricing models. While creating a simple NFT for AI agent ownership partially addresses this, it falls short of true self-custodial ownership. For this type of ownership to be achieved, the AI NFT must be dynamic, linked to all AI agent components, maintain security and verifiability, and prevent human administration access – all impossible within centralized frameworks. 

To sum it up, centralized infrastructure poses significant risks that not only diminish the value of players’ time and investments in training their agents, but also severely restrict monetization opportunities for both players and game studios. 

Conversely, with the implementation of trustless AI Agents in games, both players and developers could generate additional income by trading dynamic AI NFTs on marketplaces, renting out Agents to each other, ghosting and participating in the championships with cash prize funds, and more. 

Super Protocol and Veriplay Solution 

In contrast to AI agents confined within centralized infrastructures and under the complete control of game developers, Web3-compatible gaming agents powered by Super Protocol and the Veriplay platform will exhibit the following advantages:

  1. Confidentiality and Sovereignty of AI Agent – players retain exclusive sovereignty over their AI agents, encompassing models, data, and computational resources, effectively eliminating the possibility of third-party manipulation. 
  • Confidential Enclave Technology: Web3-compliant AI agents are computed in confidential enclaves. Confidential enclaves operate based on the Trusted Execution Environment (TEE) technology supported by Nvidia H100 GPU chips. TEE allows creating a secure area inside the processor for safe storage, processing, and protection of confidential data. Even physical access to the server will not grant access to the applications running in the enclaves. No one except the owner of the Agent knows on which servers their data is being processed, as TEE ensures complete isolation of sensitive data and models. 
  • Access and control over the system are only granted to the smart contract and verified applications loaded into it. The computational resources used for the Agent’s operation are automatically authenticated by the protocol, ensuring the user that they are processing their model and data securely. By design, these resources cannot be tampered with or exploited maliciously. The owner alone manages the model, data, and interactions of their gaming agent. 

As a result, users can be confident that the game developer cannot alter or copy the model since they do not have access to it. 

  2. Verifiability of AI Agent training and game interactions – maintaining the verifiability of the Agent throughout the chain from the storage to the server is guaranteed by the following functions:
  • The client application and the server are mutually authenticated on a TLS connection. They exchange messages signed with a secret key. Messages contain information about the hardware, application, and its settings. 
  • After mutual authentication, the game application computing process initiates. The outcome will be signed using the enclave key, maintaining the chain of trust. This trust continuity prevents unauthorized alterations, ensuring players can rely on the computation’s result. 

Therefore, the verifiability of the AI Agent’s track record and its immutable nature reassure players that acquiring an AI Agent guarantees possession of an asset with the promised properties. Moreover, by investing in its continuous training, they can have confidence that the future market price of the Agent will accurately reflect their training endeavors. 

  3. The decentralization and removal of human administration are achieved through orchestration by a smart contract system. Via smart contracts, Super Protocol entirely separates the gaming process from server and cloud owners, guaranteeing trust, flexibility and reliability.
  • Smart Contracts oversee the distribution of the system’s computing resources, assigning confidential nodes for computing tasks.
  • Supporting the necessary Service Level Agreement (SLA) and scalability is accomplished by grouping nodes into clusters and pools with automated Disaster Recovery (DR) mechanisms. 
  • Additionally, the capability to establish geo-distributed clusters with efficient local gateways is provided. 
  • The protocol also ensures secure storage through multiple network replication, encryption, and restricted access to trusted applications. 

With these features, the capacity to deploy a fault-tolerant game server, storage, and agents without dependence on specific hosts is achieved, embodying the finest decentralized cloud architecture available today. 

  4. Ownership management via NFTs and smart contracts is central to the project. The entire process of AI agents’ ownership management, and the orchestration of the marketplace where they are traded, is exclusively governed by the project’s smart contracts.

Each agent consists of on-chain data — an NFT with its own wallet, and data in storage, including a model and game interaction history. Any modifications to the model are made within the chain of trust, beginning with the initial record. These changes are exclusively executed through a smart contract, with each alteration recorded in the agent’s track record and the blockchain. 

Listing of an agent on the marketplace and transferring ownership is seamlessly and securely conducted through a smart contract, ensuring the safety of all participants. 

To sum it up, the deployment of a seamless and secure confidential, verifiable, decentralized computing service, combined with state and ownership management through smart contracts, creates a competitive and fair environment for AI agents across diverse activities. 

Uncompromised competition translates into value, meaning that an AI agent’s success in gaming tasks transforms it into an asset that accrues both the player’s time and money, along with the diversity and uniqueness of gaming scenarios it has encountered. 

NFT integration grants the agent autonomy, meaning that it can be traded and rented. Moreover, the agent becomes capable of possessing its own assets and making decisions independently, acting autonomously without external influence. 

The Rest of Web3 Infrastructure: Polygon and Immutable 

Alongside the Super Protocol, the Veriplay team chose Polygon and Immutable X technologies, merging these three platforms to establish a resilient Web3-compliant ecosystem for training and dynamically tokenizing AI Agents. On a high level, each solution has the following functions: 

  • Super Protocol provides the decentralized and confidential verifiable computing infrastructure necessary for widespread adoption of AI agents and dynamic NFTs in games. Leveraging TEE technology, Super ensures data and algorithms are safeguarded against unauthorized access, manipulation, and attacks, crucial for establishing trusted and transparent Web3 games.

  • Polygon offers a fast and scalable blockchain, delivering high performance and low transaction fees. This enables efficient management of AI agents and dynamic NFTs, ensuring a seamless and cost-friendly gaming experience. Moreover, Veriplay, with its focus on a multichain future, seamlessly integrates with other EVM-compatible networks through Polygon’s multichain framework. Additionally, Polygon’s compatibility with the Ethereum Virtual Machine grants Veriplay direct access to the vast capabilities of the Ethereum ecosystem. This ensures not only smooth scaling but also opens doors to a wider range of opportunities within the Web3 space. 
  • Immutable X stands out as a premier NFT platform focused on gaming, offering scalability, low fees, and developer-friendly tools. These features simplify the integration of dynamic NFTs into games, requiring minimal cost and effort. 

Conclusion 

Super Protocol marks a new era in Web3 Games development, paving the way for innovative and immersive game worlds controlled by players and built on principles of trust and transparency. 

By enabling exclusive ownership of confidential, verifiable, and transferable AI Agents through dynamic NFTs, players can fully trust the authenticity of in-game assets. Moreover, investors in the open market can confidently invest in NFT assets backed by real AI models, sought after by players for gaming applications.

Attestation Libraries for Confidential Computing: Veraison and SPDM Tools


Author:  Sal Kimmich

In the realm of confidential computing, ensuring trust and security in computing environments is paramount. Attestation libraries and tools provide essential components to build systems that can produce and verify evidence of trustworthiness. This blog explores the concept of attestation in confidential computing and highlights two significant projects within the Confidential Computing Consortium (CCC): Veraison and SPDM Tools.

What is Attestation in Confidential Computing?

Attestation is the process by which the hardware provides evidence about itself and the software running under its protection. Any other party can use this evidence to evaluate the trustworthiness of the Trusted Execution Environment. This process is critical in confidential computing to establish and maintain trust in computing environments, ensuring that sensitive data and operations are protected from unauthorized access and tampering.

Key Components of Attestation

  1. Evidence Generation:
    • The hardware (e.g., a device or CPU) generates evidence about its state, such as cryptographic measurements and signatures.
  2. Evidence Verification:
    • The verifier evaluates the provided evidence against a set of policies or reference values to determine the entity’s trustworthiness.
  3. Trust Anchors:
    • Cryptographic roots of trust (e.g., certificates) used to validate the attester’s identity.
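
Putting the three components together, a verifier’s appraisal step reduces to checking signed claims against reference values under a trust anchor. The sketch below is schematic: the claim names and policy format are invented for illustration.

```python
# Hypothetical appraisal: compare attested claims against reference values.
REFERENCE_VALUES = {"firmware": "a1b2...", "kernel": "c3d4..."}  # vendor-supplied

def appraise(claims: dict[str, str], signature_valid: bool) -> bool:
    if not signature_valid:          # evidence not rooted in a trust anchor
        return False
    # Every reference value must match the corresponding attested claim.
    return all(claims.get(k) == v for k, v in REFERENCE_VALUES.items())

evidence = {"firmware": "a1b2...", "kernel": "c3d4...", "boot_mode": "secure"}
print("trusted" if appraise(evidence, signature_valid=True) else "rejected")
```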

Veraison: A Comprehensive Attestation Verification Service

Project Veraison builds software components to facilitate the creation of an Attestation Verification Service. Here’s how Veraison operates and its significance:

Overview

  • Purpose: Veraison aims to simplify the development of attestation verification services by providing reusable software components. These components include verification and provisioning pipelines that can be extended with plugins to support specific attestation technologies.
  • Flexibility: The project’s core components are designed to adapt to various deployment environments through abstractions, allowing for custom service creation without the need for extensive bespoke development.

Key Features

  1. Verification Pipelines:
    • Core structures for verifying attestation evidence, ensuring that it meets established trust policies.
  2. Provisioning Pipelines:
    • Components that manage the provisioning of data required for evidence appraisal, sourced from authoritative sources.
  3. Extensibility:
    • Support for plugins allows the service to handle various attestation technologies, making it versatile and adaptable to different use cases.
  4. Community and Collaboration:
    • Veraison is a collaborative project with active community involvement, including regular public meetings and contributions from multiple organizations.

Use Case: Veraison in Action

Veraison provides reference implementations to demonstrate integration principles, offering a convenient basis for developing substantive attestation verification services. These reference implementations showcase how the core components and plugins work together to create a robust verification system. 

Veraison also supports REST APIs to assist in end-to-end integration with attestation schemes, or can be used as verification components within a custom deployment. A great example of this is a key broker service, where, upon successful attestation verification, a key is released to a Trusted Execution Environment.
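
A key broker built on such a verifier can be sketched in a few lines. Everything here is hypothetical stub logic, but it captures the flow: evidence in, appraisal, and a key out only on success.

```python
def verify_evidence(evidence: bytes) -> bool:
    """Stand-in for a call out to a verification service such as Veraison."""
    return evidence == b"expected-measurement"   # placeholder appraisal

def key_broker(evidence: bytes) -> bytes | None:
    # Release the workload's decryption key only to an attested TEE.
    if verify_evidence(evidence):
        return b"\x00" * 32                      # the (wrapped) data key
    return None                                  # no attestation, no key

assert key_broker(b"expected-measurement") is not None
assert key_broker(b"tampered-evidence") is None
```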

SPDM Tools: Enhancing Security with Attestation Protocols

SPDM (Security Protocol and Data Model) Tools offer libraries and utilities to implement the SPDM protocol, a standardized framework for secure communication and attestation between devices.

Overview

  • Purpose: SPDM Tools provide essential functionality for implementing the SPDM protocol, ensuring secure communication and attestation across various platforms.
  • Interoperability: The tools ensure interoperability between different devices and platforms, promoting a unified approach to security and attestation.

Key Features

  1. Protocol Implementation:
    • Comprehensive support for the SPDM protocol, enabling secure communication and attestation across various platforms.
  2. Utilities and Libraries:
    • A suite of tools and libraries that simplify the implementation and management of SPDM-based attestation solutions.
  3. Standardization:
    • By adhering to the SPDM standard, the tools promote consistency and reliability in attestation processes across different devices and environments.

Use Case: SPDM Tools in Secure Device Communication

SPDM Tools can establish secure communication channels between devices, ensuring that each device can verify the trustworthiness of the other before exchanging sensitive information. This capability is crucial in scenarios such as building a trusted channel between an accelerator device like a GPU and a Confidential Virtual Machine (CVM).
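
The SPDM connection itself is built up through a fixed request/response sequence. The walk-through below simply lists the message names from the DMTF specification to show the shape of the exchange; a real implementation carries full payloads and cryptographic checks at every step.

```python
# SPDM connection setup as request -> response pairs (names per the DMTF spec).
HANDSHAKE = [
    ("GET_VERSION",          "VERSION"),         # agree on a protocol version
    ("GET_CAPABILITIES",     "CAPABILITIES"),    # exchange feature flags
    ("NEGOTIATE_ALGORITHMS", "ALGORITHMS"),      # pick hash/signature suites
    ("GET_CERTIFICATE",      "CERTIFICATE"),     # fetch the device cert chain
    ("CHALLENGE",            "CHALLENGE_AUTH"),  # prove possession of the key
    ("GET_MEASUREMENTS",     "MEASUREMENTS"),    # attest current firmware state
]
for request, response in HANDSHAKE:
    print(f"requester -> {request:<22}  responder -> {response}")
```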

SPDM-RS: A Rust Implementation for SPDM Protocols

SPDM-RS is a project within the CCC that provides a Rust language implementation of the SPDM, IDE_KM, and TDISP protocols. These protocols facilitate direct device assignment for Trusted Execution Environment I/O (TEE-I/O) in Confidential Computing.

Key Features

  1. SPDM Protocol Implementation:
    • Supports various SPDM requests and responses, including version negotiation, capability negotiation, algorithm negotiation, and more.
  2. IDE_KM and TDISP Protocols:
    • Implements protocols for secure communication and device management, enhancing the trust boundary of Confidential Virtual Machines (CVMs).
  3. Cryptographic Algorithm Support:
    • Includes support for cryptographic algorithms such as SHA-256/384/512, RSA, ECDSA, AES-GCM, and ChaCha20Poly1305.
  4. Cross-Platform Support:
    • Designed to work across different platforms, ensuring broad applicability in various confidential computing scenarios.

Conclusion

Attestation libraries and tools are vital for ensuring the trustworthiness of confidential computing environments. Projects like Veraison and SPDM Tools within the Confidential Computing Consortium provide essential components for building robust attestation solutions. By leveraging these tools, developers can create systems that securely verify and manage trust, protecting sensitive data and operations from potential threats.

Fr0ntierX Joins the Confidential Computing Consortium as a Startup Member


August 26, 2024 – Fr0ntierX, a leader in secure AI and cybersecurity, has officially joined the Confidential Computing Consortium. This recognition, driven by Fr0ntierX’s cutting-edge Janus platform, marks a significant milestone for the company.

Janus offers a novel approach to secure AI through confidential computing. This technology ensures complete data encryption at every level, making it indispensable for industries requiring top-tier security.

Fr0ntierX’s inclusion in the Consortium underscores its commitment to advancing secure computing in collaboration with the industry’s best.

“This community is unique. Nowhere else do you have competing companies come together with a shared goal of advancing the industry together. For us, it’s an incredible opportunity to integrate Janus with new ideas, ensuring our solutions continue to meet the highest standards,” said Jonathan Begg, CEO of Fr0ntierX. 

With a team of industry experts, Ph.D.s, and strategic advisors, Fr0ntierX provides guidance and support to help businesses maximize the benefits of AI adoption while maintaining the highest standards of security and compliance.

Fr0ntierX empowers enterprises, government agencies, and academic institutions to leverage the power of AI and Large Language Models (LLMs) without compromising security. Their flagship product, Janus, features advanced encryption and robust cybersecurity – powered by confidential computing – safeguarding data from storage to processing. By eliminating master keys, Janus mitigates common threats and ensures data integrity. Unlike typical AI models, which may expose data to third parties, Janus operates within a fully isolated environment, providing a secure container for AI workflows and the compartmentalization of context data, making it ideal for sectors that handle sensitive information.

By joining the Confidential Computing Consortium, Fr0ntierX aims to further accelerate innovation in secure computing by collaborating with industry leaders to drive the adoption of confidential computing technologies.
