Post-Quantum Encryption
How Lattix applies post-quantum protection to protected data, key access, and trust artifacts — strengthening the asymmetric and trust-bearing layers around the protected object.
Long-lived sensitive data can remain valuable far beyond the lifetime of the cryptographic assumptions used to protect it. A system that protects data only for present-day conditions can still fail if adversaries capture encrypted material now and retain it for future decryption. This is the harvest-now-decrypt-later problem.
Lattix addresses that risk by applying post-quantum protection to the key-access, signing, and policy-bound trust layers around protected data. The goal is not to change the object-centric security model, but to strengthen the cryptographic protections that govern who can access protected objects, how trust is established, and how those requirements remain attached to the data over time.
The Lattix approach
Lattix does not treat post-quantum protection as a separate subsystem bolted onto the side of the platform. It fits into the same protected-object and hierarchical key model used for every other security function.
- A protected object remains encrypted as ciphertext under its own object key.
- Access to that object is governed through higher-level wrapped key material and recipient access paths.
- Policies and security metadata bind cryptographic requirements to the object.
- Signatures protect the trust artifacts that authorize and describe access.
Post-quantum protection strengthens the control and access layers around the data without changing the way protected objects move across clouds, endpoints, edge nodes, and trusted partner environments.
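The layers listed above can be sketched as a minimal data structure. This is an illustration of the model, not the Lattix wire format; every field name here is an assumption.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ProtectedObject:
    """Illustrative sketch of the protected-object layers described above.

    Field names are hypothetical; the real format is defined by the platform.
    """
    ciphertext: bytes            # payload encrypted under its own object key
    wrapped_object_key: bytes    # higher-level wrapped key material for recipients
    policy_metadata: dict        # cryptographic requirements bound to the object
    signatures: dict = field(default_factory=dict)  # signatures over trust artifacts

# The object carries its requirements wherever it moves:
obj = ProtectedObject(
    ciphertext=b"\x93...",
    wrapped_object_key=b"\x41...",
    policy_metadata={"profile": "pq-hybrid", "classification": "secret"},
)
```

Because the requirements live on the object rather than in local configuration, copying or moving `obj` carries its `policy_metadata` along with it.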
Where post-quantum protection applies
Post-quantum protection applies to the asymmetric and trust-bearing layers around the protected object, not to the object model itself. It lands in four places: the key-access path, the signatures on trust artifacts, the policy-bound metadata that carries the required protection profile, and cross-boundary access, which keeps those requirements attached as the object moves.
Key protection and recipient access
Every protected object carries encrypted payload data and a protected access path to the key material needed for authorized use. Post-quantum key encapsulation strengthens the cryptographic protection applied to that access path so long-lived protected data is less exposed to future cryptanalytic advances.
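Conceptually, a KEM-protected access path works in three steps: encapsulate to the recipient's public key, derive a wrapping key from the resulting shared secret, and wrap the object key. The sketch below stubs the KEM with toy functions so the flow is runnable; in a real deployment the `toy_kem_*` stand-ins would be ML-KEM operations, and the XOR wrap would be an authenticated key-wrap mode. Nothing here is secure or Lattix-specific.

```python
import hashlib
import hmac
import secrets

def toy_kem_keygen():
    # Stand-in for ML-KEM key generation. The "public key" is just a hash of
    # the private key so the round trip below works. NOT secure.
    sk = secrets.token_bytes(32)
    return hashlib.sha256(sk).digest(), sk

def toy_kem_encap(pk: bytes):
    # Stand-in for ML-KEM encapsulation: returns (kem_ciphertext, shared_secret).
    eph = secrets.token_bytes(32)
    return eph, hashlib.sha256(pk + eph).digest()

def toy_kem_decap(sk: bytes, kem_ct: bytes) -> bytes:
    pk = hashlib.sha256(sk).digest()
    return hashlib.sha256(pk + kem_ct).digest()

def wrap(key: bytes, wrapping_key: bytes) -> bytes:
    # Toy XOR wrap keyed by an HMAC-derived stream; it is its own inverse.
    stream = hmac.new(wrapping_key, b"wrap", hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(key, stream))

# Sender protects the object key for a recipient:
pk, sk = toy_kem_keygen()
object_key = secrets.token_bytes(32)
kem_ct, shared = toy_kem_encap(pk)
wrapped = wrap(object_key, shared)

# Recipient recovers the shared secret and unwraps the object key:
recovered = wrap(wrapped, toy_kem_decap(sk, kem_ct))
assert recovered == object_key
```

The point of the shape, rather than the toy internals, is that only the KEM layer changes when a post-quantum profile is applied; the symmetric payload encryption underneath is untouched.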
Trust artifacts and signatures
Access decisions, assertions, manifests, and related control artifacts depend on signatures to preserve authenticity and integrity. Post-quantum signatures extend that protection to the trust-bearing artifacts that describe and authorize access to protected data.
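The sign-and-verify flow over a trust artifact can be sketched as follows. HMAC is used here purely as a runnable placeholder; a real deployment would use an ML-DSA keypair (a public-key signature, not a MAC), and the artifact fields shown are hypothetical.

```python
import hashlib
import hmac
import json

def sign_artifact(signing_key: bytes, artifact: dict) -> bytes:
    # Canonicalize before signing so field order cannot change the signature.
    canonical = json.dumps(artifact, sort_keys=True).encode()
    return hmac.new(signing_key, canonical, hashlib.sha256).digest()

def verify_artifact(signing_key: bytes, artifact: dict, sig: bytes) -> bool:
    return hmac.compare_digest(sign_artifact(signing_key, artifact), sig)

manifest = {"object_id": "obj-123", "profile": "pq-hybrid", "issuer": "org-a"}
key = b"demo-signing-key"
sig = sign_artifact(key, manifest)

assert verify_artifact(key, manifest, sig)

# Any tampering with the artifact invalidates the signature:
tampered = dict(manifest, profile="classical-only")
assert not verify_artifact(key, tampered, sig)
```

The tampering check is the property that matters: an artifact that authorizes access under one profile cannot be silently downgraded to a weaker one.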
Policy-bound metadata
Cryptographic requirements are not only runtime configuration. They can be represented as part of the protected object's authoritative security metadata so the required protection profile remains attached to the data wherever it moves.
Cross-boundary access
When protected data moves into a trusted partner environment, the source organization's cryptographic requirements remain attached to the object. The receiving environment validates and enforces those requirements instead of stripping or replacing them, reducing the risk that protection weakens as data crosses organizational boundaries.
Why this matters
Most security architectures protect the environment around the data: the network, the application, the storage boundary, or the identity perimeter. Lattix protects the object itself. That means the cryptographic requirements can travel with the protected data rather than depending on a specific network location or infrastructure stack.
Post-quantum protection is especially important for long-lived data, regulated archives, intelligence holdings, and partner-shared information that may remain sensitive across years of storage, replication, and reuse. In those cases, the risk is not only who can access the data today, but whether the cryptographic protections around the data will remain credible over time.
Supported post-quantum mechanisms
Lattix supports post-quantum cryptographic profiles as part of its broader protected-object model.
Supported profiles may include:
- ML-KEM for post-quantum key encapsulation and key protection.
- ML-DSA for post-quantum digital signatures.
- Classical algorithms where compatibility still requires them.
- Hybrid profiles for staged migration where both classical and post-quantum protection are required during transition.
Payload confidentiality remains anchored in symmetric encryption. Post-quantum transition primarily affects the asymmetric layers used for key encapsulation, key protection, and digital signatures.
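In a hybrid profile, the wrapping key is typically derived from both a classical and a post-quantum shared secret, so confidentiality holds as long as either assumption survives. A common combiner shape looks like the sketch below; the KDF construction, labels, and secret sources are illustrative assumptions, not the Lattix definition.

```python
import hashlib
import hmac
import secrets

def hybrid_wrapping_key(classical_ss: bytes, pq_ss: bytes, context: bytes) -> bytes:
    # HKDF-style extract-then-expand: both secrets feed a single derivation,
    # so an attacker must break BOTH the classical exchange and the PQ KEM
    # to recover the wrapping key.
    prk = hmac.new(b"hybrid-kem-v1", classical_ss + pq_ss, hashlib.sha256).digest()
    return hmac.new(prk, context, hashlib.sha256).digest()

classical_ss = secrets.token_bytes(32)   # e.g. from an ECDH exchange
pq_ss = secrets.token_bytes(32)          # e.g. from an ML-KEM encapsulation
k = hybrid_wrapping_key(classical_ss, pq_ss, b"object-key-wrap")
assert len(k) == 32
```

Binding a context label into the derivation keeps keys wrapped for one purpose from being reused for another.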
Post-quantum profiles are governed through the same tenant and policy model used elsewhere in the platform, so they can be applied selectively based on classification, retention horizon, trust boundary, or mission context.
Profiles and transition posture
Post-quantum protection is not an all-or-nothing switch. Different protected objects and workloads can require different cryptographic postures.
- A tenant may require stronger post-quantum protection for long-lived sensitive data.
- A transition profile may combine classical and post-quantum protections during a staged migration period.
- Short-lived or lower-risk workflows may remain on simpler profiles where policy permits.
The important architectural point is that these choices do not create a separate protection model. They are expressed through cryptographic profiles that bind directly to the same protected-object, policy, and key-access architecture used throughout Lattix.
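The posture choices above reduce to policy that maps object attributes to a required profile. A hypothetical selection rule is sketched below; the attribute names, thresholds, and profile labels are all assumptions for illustration.

```python
def required_profile(classification: str, retention_years: int) -> str:
    # Illustrative policy: long-lived or highly classified data gets full
    # post-quantum protection, medium-lived data gets a hybrid transition
    # profile, and short-lived low-risk data may stay classical.
    if classification in {"secret", "top-secret"} or retention_years >= 10:
        return "pq"
    if retention_years >= 3:
        return "hybrid"
    return "classical"

assert required_profile("secret", 1) == "pq"
assert required_profile("internal", 5) == "hybrid"
assert required_profile("public", 1) == "classical"
```

Because the rule is ordinary attribute-based policy, it plugs into the same tenant and ABAC machinery as any other access decision.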
How post-quantum requirements stay with the data
One of the strongest properties of the Lattix model is that cryptographic requirements can remain bound to the protected object itself.
- The payload remains ciphertext.
- The object's key-access path remains protected.
- The object carries security metadata that describes the required protection posture.
- Any environment attempting authorized access must validate and satisfy those requirements.
This prevents cryptographic requirements from becoming local-only configuration that is lost when data is copied, shared, cached, or moved between organizations.
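The portable-requirements property implies a check at access time: before key material is released, the requesting environment must show it satisfies the profile bound to the object. A hypothetical gate, with an assumed strength ordering over profile labels:

```python
# Illustrative strength ordering over hypothetical profile labels.
PROFILE_STRENGTH = {"classical": 0, "hybrid": 1, "pq": 2}

def may_release_key(object_metadata: dict, environment_profiles: set) -> bool:
    # Release key material only if the environment supports a profile at
    # least as strong as the one bound to the object. The requirement comes
    # from the object itself, not from local configuration.
    required = object_metadata["profile"]
    return any(
        PROFILE_STRENGTH[p] >= PROFILE_STRENGTH[required]
        for p in environment_profiles
    )

meta = {"profile": "hybrid"}
assert may_release_key(meta, {"hybrid", "classical"})
assert not may_release_key(meta, {"classical"})
```

A receiving environment that cannot satisfy the bound profile simply cannot unlock the object, which is what keeps protection from degrading as data is copied or shared.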
Protected sharing across trusted organizations
Each organization operates within its own security domain, but trusted fabrics can validate and enforce source-origin protections on shared data. When Org A protects data under a post-quantum profile and shares it with a trusted Org B environment, the cryptographic requirements set by Org A remain attached to the protected object. Org B can validate the manifest, trust metadata, and access requirements without taking ownership away from the source controls that govern the object.
This reduces the risk that data loses protection fidelity as it moves across coalition, partner, or mission boundaries.
Relationship to the rest of the platform
Post-quantum protection is not isolated from the rest of the Lattix architecture.
- The Hierarchical Key Model governs how protected-object access is structured.
- The Trusted Data Format protected object carries the ciphertext, protected key-access path, and profile metadata.
- The Key Access Service mediates governed access to protected key material.
- Policies and ABAC determine which protection requirements apply to which data and which requesters.
- The Immutable Ledger records the trust and access events associated with protected-object use.
- Cross-fabric trust in the Zero Trust Fabric preserves source-origin cryptographic requirements as protected data moves between trusted organizations.
- Cryptographic Agility explains how administrators manage profile changes, key lifecycles, epochs, rewrap campaigns, and observable transition events over time.
Post-quantum protection in Lattix is not only about stronger algorithms. It is about preserving cryptographic trust around protected data as that data moves across time, infrastructure, and organizational boundaries.