ML-KEM Post-Quantum TLS in AWS KMS, ACM, and AWS Secrets Manager

AWS Secrets Manager, ACM, and AWS KMS now support ML-KEM post-quantum TLS.
Amazon Web Services (AWS) is pleased to announce that three services now implement the latest hybrid post-quantum key agreement standards for TLS. AWS KMS, ACM, and AWS Secrets Manager support ML-KEM for hybrid post-quantum key agreement on non-FIPS endpoints in every AWS Region. Hybrid post-quantum key agreement is also an optional feature of the AWS Secrets Manager Agent, which is built on the AWS SDK for Rust, so customers can retrieve secrets in their applications over end-to-end post-quantum TLS.
These three services were chosen because they are security-critical and have the strongest need for post-quantum confidentiality. All three previously served CRYSTALS-Kyber, ML-KEM's predecessor, and ML-KEM will replace CRYSTALS-Kyber across all AWS service endpoints in 2026.
AWS's use of post-quantum cryptography
AWS will follow its post-quantum cryptography migration strategy. As part of this commitment, and under the post-quantum shared responsibility model, AWS plans to offer ML-KEM across all AWS services with HTTPS endpoints in the coming years. Customers connecting to those endpoints must upgrade their TLS clients and SDKs to enable ML-KEM; doing so protects traffic captured today from being harvested now and decrypted later, once quantum computers become capable. When clients offer ML-KEM, AWS HTTPS endpoints will select it.
Hybrid post-quantum key agreement is negotiated by AWS's open-source, FIPS 140-3-validated cryptographic library, AWS Libcrypto (AWS-LC), and its open-source TLS implementation, s2n-tls, which are used across AWS service HTTPS endpoints.
Effect of hybrid post-quantum ML-KEM on TLS performance
An ECDH+ML-KEM hybrid key agreement requires the TLS handshake to carry more data and perform more cryptographic operations than an ECDH-only key agreement. Switching from a classical to a hybrid post-quantum key agreement adds roughly 1,600 bytes to the TLS handshake, and the ML-KEM cryptographic operations need an additional 80–150 microseconds of processing time. This one-time connection cost is amortized across all the HTTP requests sent over the connection's lifetime.
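To make the amortization point concrete, here is a minimal Python sketch using the figures quoted above (the helper name and the 1,000-request example are illustrative assumptions, not AWS's benchmark code):

```python
# Figures quoted above: a hybrid handshake adds ~1,600 bytes and 80-150
# microseconds of extra crypto processing, paid once per TLS connection.
HANDSHAKE_EXTRA_US = 150  # upper bound on added processing time

def amortized_overhead_us(requests_per_connection: int) -> float:
    """Extra handshake time spread across every request on the connection."""
    return HANDSHAKE_EXTRA_US / requests_per_connection

# With connection reuse (e.g. 1,000 requests per connection), the per-request
# cost of the hybrid handshake drops to a fraction of a microsecond.
print(amortized_overhead_us(1000))  # 0.15
```

This is why the measured impact all but disappears when TLS connection reuse is enabled.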
AWS wants the migration to hybrid post-quantum key agreement in TLS to be smooth. To help customers understand its impact, this post benchmarks typical workloads with ML-KEM enabled.
AWS measured how many AWS KMS GenerateDataKey requests per second a single thread can send serially between an Amazon EC2 c6in.metal client and the public AWS KMS endpoint using the AWS SDK for Java v2. The client and server were both in the us-west-2 Region.
Classical TLS connections to AWS KMS used the P256 elliptic curve for key agreement, whereas hybrid post-quantum TLS connections used X25519 with ML-KEM-768. Your performance will vary with instance type, workload profile, parallelism and thread utilisation, network location, and capabilities. HTTP request transaction rates were measured with TLS connection reuse both enabled and disabled.
AWS found that hybrid post-quantum TLS has negligible performance impact with standard SDK configuration settings: in testing, it lowered the maximum TPS rate by just 0.05 percent for a default-case sample workload. Even when SDK settings were modified to force a fresh TLS handshake for every request, the maximum TPS rate dropped by only 2.3 percent.
AWS service endpoints that support CRYSTALS-Kyber, ML-KEM's predecessor, will continue to do so through 2025; support will be phased out as clients move to ML-KEM. Customers using earlier Java SDKs with CRYSTALS-Kyber support should upgrade to the latest AWS SDK with ML-KEM support. Those on the general release of the AWS SDK for Java v2 can move from CRYSTALS-Kyber to ML-KEM without code changes.
After CRYSTALS-Kyber is removed from AWS service HTTPS endpoints in 2026, clients still negotiating it that have not upgraded their AWS SDK for Java v2 will gracefully fall back to a classical key agreement.
Enabling hybrid post-quantum key agreement
If you're using the AWS SDK for Rust, add the rustls crate to your project and enable its prefer-post-quantum feature flag to enable hybrid key agreement.
With the AWS SDK for Java 2.x, enable hybrid post-quantum key agreement by calling .postQuantumTlsEnabled(true) on your AWS Common Runtime (CRT) HTTP client builder.
First, add the AWS Common Runtime HTTP client to your Java project's Maven dependencies. AWS recommends the latest version; ML-KEM requires version 2.30.22 or later.
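A sketch of the Maven dependency, assuming the CRT client's usual software.amazon.awssdk:aws-crt-client coordinates and the minimum version quoted above:

```xml
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>aws-crt-client</artifactId>
    <version>2.30.22</version> <!-- 2.30.22 or later for ML-KEM -->
</dependency>
```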
Second, enable post-quantum TLS in your Java SDK client.
Configure your AWS service client with an AwsCrtAsyncHttpClient that has post-quantum TLS enabled.
Things to try
Tips for utilising this post-quantum client:
Benchmark and load test: the AwsCrtAsyncHttpClient is performance-optimized and uses AWS Libcrypto on Linux. If you aren't already using it, first measure the performance gains it offers over the default SDK HTTP client, then enable post-quantum TLS on top of it.
Try connecting from different networks: intermediary hosts, proxies, or deep packet inspection (DPI) firewalls on the network path may reject requests that use these new TLS mechanisms. Your IT administrators or security team may need to update firewall rules to allow them.
In conclusion
Three security-critical AWS service endpoints now support ML-KEM hybrid key agreement. With TLS connection reuse enabled, hybrid post-quantum TLS can have negligible performance impact: AWS testing showed that AWS KMS GenerateDataKey throughput dropped by only 0.05 percent.
As of version 2.30.22, the AWS SDK for Java v2 supports ML-KEM-based hybrid key agreement for Linux-based AWS Common Runtime HTTP clients. Enable post-quantum TLS key agreement in your Java SDK client configuration today.
AWS plans to offer ML-KEM-based hybrid post-quantum key agreement on all AWS service HTTPS endpoints over the next several years as part of its cryptography migration strategy. Customers must update their TLS clients and SDKs to negotiate ML-KEM key agreement when connecting to AWS service HTTPS endpoints. This protects traffic from harvest-now-decrypt-later attacks ahead of future quantum computing threats.
IBM-Developed Algorithms Announced as NIST's First Published Post-Quantum Cryptography Standards
MANILA – Two algorithms developed by IBM (NYSE: IBM) have been officially published among the first three post-quantum cryptography standards announced by the U.S. Department of Commerce's National Institute of Standards and Technology (NIST). The standards include three post-quantum cryptographic algorithms: two of them, ML-KEM (originally known as CRYSTALS-Kyber) and ML-DSA (originally…
Google Chrome Switches to ML-KEM for Post-Quantum Cryptography Defense
http://i.securitythinkingcap.com/TDLFBM
Microsoft PQC: ML-KEM and ML-DSA Algorithms for Windows & Linux

Microsoft has made significant progress on post-quantum cryptography (PQC) with SymCrypt-OpenSSL version 1.9.0 for Linux and with Windows Insiders (Canary Channel Build 27852 and higher). These releases let customers test PQC algorithms such as ML-KEM and ML-DSA in real operational environments, bringing quantum-resistant cryptography to Linux and Windows Insiders.
Due to quantum computing, modern cryptography faces significant challenges. Microsoft is providing early access to PQC capabilities to help organisations evaluate the performance, interoperability, and integration of these novel algorithms with current security infrastructure. This pragmatic approach helps security teams identify challenges, refine implementation strategies, and ease the transition when industry standards evolve. Early adoption also helps prevent new vulnerabilities and protect private data from quantum threats.
Next-generation cryptography API update
Enhancements to Cryptography API: Next Generation (CNG) are central to this Windows release. CNG is the long-term replacement for CryptoAPI; it is extensible and cryptography-agnostic. Developers use CNG in programs that create and share data securely, especially over insecure channels such as the internet. CNG developers should know C, C++, and Windows programming, though it's not required; familiarity with cryptography and security is also recommended.
Developers building CNG cryptographic algorithm or key storage providers must download Microsoft's Cryptographic Provider Development Kit. CNG was first supported in Windows Server 2008 and Windows Vista. The latest PQC updates span encrypted communications, the CNG libraries, and certificates.
New Windows PQC Algorithms
Microsoft is providing ML-KEM and ML-DSA, two NIST-standardized algorithms, to Windows Insiders via CNG updates.
Developers can now try ML-KEM for public key encapsulation and key exchange. This helps prepare for the “harvest now, decrypt later” scenario, in which attackers store encrypted data today so they can decrypt it with a quantum computer later. Microsoft recommends a hybrid approach that combines ML-KEM with RSA or ECDH for defence in depth during the transition, ideally at NIST security level 3 or higher.
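The hybrid idea can be sketched as follows. This is an illustrative Python sketch only, not Microsoft's CNG API: real hybrid schemes such as X25519MLKEM768 feed both shared secrets into the protocol's own key schedule, and the hash-based combiner and stand-in secrets here are assumptions for demonstration.

```python
import hashlib
import os

def hybrid_shared_secret(classical_ss: bytes, mlkem_ss: bytes) -> bytes:
    """Combine a classical and a post-quantum shared secret.

    Illustrative combiner only: the point of a hybrid is that the derived
    key stays secure if EITHER input secret remains unbroken, so breaking
    ML-KEM alone (or ECDH alone) does not expose the traffic.
    """
    return hashlib.sha256(classical_ss + mlkem_ss).digest()

ecdh_ss = os.urandom(32)   # stand-in for an ECDH shared secret
mlkem_ss = os.urandom(32)  # stand-in for an ML-KEM shared secret
key = hybrid_shared_secret(ecdh_ss, mlkem_ss)
print(len(key))  # 32
```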
By incorporating ML-DSA in CNG, developers can evaluate PQC algorithms for digital signature verification of identity, integrity, or authenticity. Microsoft recommends a hybrid approach, using ML-DSA alongside RSA or ECDSA throughout the transition.
Preliminary research indicates these new algorithms have size and performance impacts. Customers should analyse these effects on their environments and applications early.
Using the Windows certificate API (wincrypt), customers can test installing, importing, and exporting ML-DSA certificates to and from the certificate store with the CNG and PQC updates. Post-quantum certificate chains and trust status can also be verified.
PQC Linux Features
Because Linux customers expect them, Microsoft is releasing PQC upgrades in the SymCrypt provider for OpenSSL 3. The provider lets Linux developers keep using OpenSSL's API surface while the underlying cryptographic operations are performed by SymCrypt.
SymCrypt-OpenSSL 1.9.0 supports TLS hybrid key exchange testing per the latest IETF internet draft. This lets you prepare early for “harvest now, decrypt later” risks and enables a full study of how hybrid PQC algorithms affect handshake message length, TLS handshake latency, and connection efficiency. Such studies are needed to understand PQC's real-world trade-offs.
It is important to remember that SymCrypt-OpenSSL will be updated when standards change to ensure compliance and compatibility, and that Linux updates are based on draft specifications.
What Next?
The Linux and Windows Insider integrations described above are only the first step for PQC.
Plans call for more features and improvements:
Upcoming efforts include adding SLH-DSA to SymCrypt, CNG, and SymCrypt-OpenSSL.
Add new algorithms to assure broad compatibility as PQC standards expand, improve security, and comply with international law.
Working with industry partners on X.509 standardisation in the IETF LAMPS working group for broad use of ML-DSA, composite ML-DSA, SLH-DSA, ML-KEM, and LMS/XMSS. These efforts will cover PKI use cases and signature approaches for firmware and software signing.
TLS hybrid key exchange for Windows users is being implemented using the Windows TLS stack (Schannel).
Develop and standardise quantum-safe authentication methods for TLS and other IETF protocols including SLH-DSA, Composite ML-DSA, and pure ML-DSA with the IETF. SymCrypt for OpenSSL, Windows TLS stack (Schannel), and Linux Rust Wrapper will deliver standards as they are established.
Active Directory Certificate Services (ADCS) will actively support PQC. Customers setting up a Certification Authority (CA) can use ML-DSA-based CA certificates, and CA-issued CRLs will be signed with PQC algorithms for customers who enrol end-entity certificates. All ADCS role services will be supported.
Supporting PQC certificates in Microsoft Intune's Certificate Connector lets endpoints and mobile devices sign up for quantum-safe credentials. This will unlock SCEP & PKCS #12 scenarios for on-premises CAs utilising ADCS.
TLS 1.3 is essential for PQC. Microsoft strongly advises customers to abandon older TLS protocols.
These new features will be available to Windows Insiders and development channels for real-world testing. Microsoft can make incremental modifications before release by getting feedback on usability, security, and compatibility. Microsoft will distribute dependable and compatible solutions to supported platforms using a flexible and adaptable approach after standards are finalised. Working with standards organisations and industry partners will ensure features fit global regulatory framework and ecosystem needs.
Future challenges and prospects
PQC algorithms are still young and form an emerging field. This underscores the importance of “crypto agility”: building solutions that can switch between algorithms or be modified as standards change.
Microsoft recommends hybrid PQ and crypto-agile solutions for PQC deployment. Composite certificates and TLS hybrid key exchange use PQ and RSA or ECDHE algorithms. Pure PQ implementations should increase as algorithms and standards improve.
Although this integration is a turning point, PQC algorithms still face challenges in performance, interoperability with current systems, and adoption.
Performance: PQC algorithms often require more processing power than classical ones, and implementing them efficiently without degrading system performance is a major hurdle, making hardware acceleration and optimisation technology essential. Many PQ algorithms rely on Keccak, whose performance particularly benefits from hardware acceleration.
Larger key encapsulations and digital signatures, especially in hybrid mode, may increase TLS round-trip time. Although signatures cannot be compressed, IETF proposals are examining certificate compression and TLS key share prediction. These effects should be assessed against your applications and environments.
Adoption and Compatibility: PQC requires upgrading and replacing cryptographic infrastructure. Developers, hardware manufacturers, and service providers must collaborate to ensure legacy system compatibility and broad acceptance. Education and awareness campaigns and government-mandated compliance deadlines will boost adoption.
In conclusion
PQC incorporation into Linux and Windows Insiders is a major quantum future preparation step. Microsoft is proactively fixing cryptographic security flaws to help create a digital future that uses quantum computing and reduces security risks. PQC is needed to protect data, communications, and digital infrastructure as quantum computing evolves. Cooperation and security are needed to build stronger systems.
CNSA 2.0 Algorithms: OpenSSL 3.5’s Q-Safe Group Selection

The CNSA 2.0 Algorithm
To prioritise quantum-safe cryptographic methods, OpenSSL 3.5 improves TLS 1.3 per NSA CNSA 2.0 recommendations. With these changes, servers and clients can prefer Q-safe algorithms during the TLS handshake.
OpenSSL achieves this through configuration alone, without modifying the TLS protocol: servers use a delimiter to group algorithms into preference tuples, while clients use a prefix to indicate which groups receive key shares.
These changes provide backward compatibility and reduce network round trips to enable a smooth transition to post-quantum cryptography while maintaining the “prefer” criterion for Q-safe algorithms. This version of OpenSSL is the first major TLS library to completely implement CNSA 2.0, and its long-term support makes it likely to be widely deployed.
Q Safe
Quantum-Safe Cryptography and Quantum Computer Danger
The possibility that quantum computers may break asymmetric encryption drives this research.
“Future quantum computers will break the asymmetric cryptographic algorithms widely used online.”
To secure internet communication, quantum-safe (Q-safe) cryptographic methods must be used.
The NSA's CNSA 2.0 mandate
The NSA's Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) lists approved quantum-safe algorithms and their implementation timetable. For TLS, it allows ML-KEM (FIPS 203) for key agreement and ML-DSA or SPHINCS+ for certificates.
The CNSA 2.0 requirement requires systems to “prefer CNSA 2.0 algorithms” during transition and “accept only CNSA 2.0 algorithms” as products develop. This two-phase method aims for a gradual transition.
The TLS “Preference” Implementation Challenge
TLS 1.3 (RFC 8446) gives clients and servers wide freedom in choosing cryptographic algorithms but defines no mechanism for expressing a preference among post-quantum methods; the protocol simply does not require such a decision.
A way to configure TLS connections to favour CNSA 2.0 algorithms was therefore urgently needed, without modifying the TLS protocol itself.
OpenSSL v3.5 Improves Configuration Features
Developers focused on extending OpenSSL's configuration capabilities, since altering the TLS standard was not an option. The goal was to let OpenSSL-based programs such as cURL, HAProxy, and Nginx use the new preference options without modifying their code.
Client-Side Solution: Prefix Characters for Preference
In OpenSSL v3.5, clients mark Q-safe algorithms by prefixing the algorithm name with a ‘*’ character in the colon-separated group list. With such a configuration, the client generates and sends key shares for ML-KEM-1024 and x25519 in its ClientHello while advertising support for four algorithms.
To limit the network overhead of the larger Q-safe key shares, a client may send at most four key shares (a limit adjustable via a build option). This design accommodates fully Q-safe, hybrid, legacy, and spare algorithms.
For backward compatibility, if no ‘*’ prefix is supplied, only the first algorithm in the list receives a key share.
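The client-side convention can be sketched with a small parser. This Python sketch is illustrative (the function name and example group string are assumptions), not OpenSSL code:

```python
def key_share_groups(group_spec: str) -> list[str]:
    """Return the groups a client would generate key shares for, per the
    OpenSSL 3.5 convention sketched above: '*' marks a group for a key
    share; with no '*' anywhere, only the first group gets one."""
    groups = group_spec.split(":")
    starred = [g[1:] for g in groups if g.startswith("*")]
    return starred if starred else groups[:1]

spec = "*X25519MLKEM768:*x25519:secp256r1:ffdhe2048"
print(key_share_groups(spec))  # ['X25519MLKEM768', 'x25519']
print(key_share_groups("x25519:secp256r1"))  # ['x25519'] (legacy fallback)
```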
Server-Side Solution: Preference Hierarchy Algorithm Tuples
The server-side technique overcomes TLS's lack of a native “preference” mechanism by letting the server declare its preferred algorithm order as tuples delimited by the ‘/’ character in the colon-separated algorithm list.
The server can pick algorithms using a three-level priority scheme.
Tuple processing from left to right is most important.
Second priority is client-provided key sharing overlap inside a tuple.
Third, overlap within a tuple using client-supported methods without key sharing.
Example: the specification ML-KEM-768 / X25519MLKEM768:x25519 / SecP256r1MLKEM768 defines three tuples. The server prioritises algorithms from earlier tuples first, then key share availability within a tuple, and finally general client support.
Even when a legacy algorithm's key share is readily available, this scheme ensures that the server favours Q-safe algorithms, as CNSA 2.0's “prefer” requirement demands, even at the risk of a HelloRetryRequest (HRR) round-trip penalty; the new specification syntax makes that trade-off explicit and controllable.
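The three-level selection described above can be sketched as follows. This is an illustrative Python model of the behaviour, not OpenSSL's implementation, and the helper names are assumptions:

```python
def server_select(server_spec, client_key_shares, client_supported):
    """Model of the tuple-based preference: tuples are separated by '/',
    left-most tuple wins; within a tuple, prefer a group the client already
    sent a key share for, then any group the client merely supports."""
    for tup in server_spec.split("/"):
        groups = [g.strip().lstrip("*") for g in tup.split(":")]
        for g in groups:
            if g in client_key_shares:
                return g, False   # no HelloRetryRequest needed
        for g in groups:
            if g in client_supported:
                return g, True    # supported, but an HRR fetches the share
    return None, False

# Client sent only a classical x25519 share but supports the hybrid group:
choice = server_select("X25519MLKEM768:SecP256r1MLKEM768/x25519",
                       client_key_shares={"x25519"},
                       client_supported={"x25519", "X25519MLKEM768"})
print(choice)  # ('X25519MLKEM768', True): Q-safe preferred despite HRR cost
```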
Keep Backward Compatibility and Reduce Impact on Current Systems
Designing for backward compatibility was crucial for a smooth transition. The new configuration format doesn't need code changes for existing apps. To avoid disrupting other features, OpenSSL codebase tweaks were carefully made in “a few pinpointed locations” of the huge codebase.
Additional Implementation Considerations
A “?” prefix was added to ignore unknown algorithm names; pseudo-algorithm names like “DEFAULT” are handled; and the same specification string can be used by both client and server, with the client ignoring server-specific delimiters and the server ignoring client-specific prefixes.
OpenSSL v3.5's Collaboration and Importance
Development involved extensive consultation and collaboration with the OpenSSL maintainer team and other experts, with excellent interactions throughout.
OpenSSL v3.5 is “the first TLS library to fully adhere to the CNSA 2.0 mandate to prefer Q-safe algorithms.” Due to its Long-Term Support (LTS) status, Linux distributions are expected to adopt OpenSSL v3.5 more extensively, making these quantum-safe communication capabilities available.
Conclusion
OpenSSL v3.5 must have the Q-safe algorithm preference to safeguard internet communication from quantum computers. The developers satisfied the NSA's CNSA 2.0 criteria by cleverly increasing OpenSSL's configuration features without requiring large code modifications in OpenSSL-reliant applications or TLS standard changes.
Client-side prefix and server-side tuple-based preference systems give quantum-resistant cryptography precedence in a backward-compatible way, enabling a safe digital future. OpenSSL v3.5's LTS status ensures its widespread use, enabling quantum-safe communication on many computers.
FAQs
What does “quantum safe” mean?
“Quantum safe” describes security and encryption that withstand attacks from both conventional and quantum computers. It involves developing and deploying cryptographic methods that can resist quantum computing threats.
IBM Post Quantum Cryptography Mitigates Quantum Risk

Today’s CIOs need to get their companies ready for quantum-secure cryptography
IBM Post Quantum Cryptography
After years of pure research, quantum computers are becoming practical tools. Organizations and enterprises use them to explore hard problems in high-energy physics, materials development, optimization, sustainability, and healthcare and life sciences. But as quantum computers grow larger, they will also be able to solve some of the hard mathematical problems that are the foundation of modern public key cryptography. A future cryptographically relevant quantum computer (CRQC) could break the globally used asymmetric encryption techniques that today help ensure the confidentiality and integrity of data and the authenticity of system access.
A CRQC carries a wide range of hazards, including the potential for data breaches, disruptions to digital infrastructure, and potentially extensive worldwide manipulation. These quantum computers of the future will be one of the most dangerous threats to the digital economy and present a serious cyberthreat to companies.
The risk is already present today. The “harvest now, decrypt later” threat refers to cybercriminals collecting encrypted data now with the intention of decrypting it once a CRQC becomes available, gaining unauthorized access to highly sensitive data retrospectively.
The rescue of post-quantum cryptography
Thankfully, post-quantum cryptography (PQC) techniques have been standardized and are able to secure today’s systems and data. The first three standards were just published by the National Institute of Standards and Technology (NIST):
A key encapsulation technique called ML-KEM is chosen for broad encryption, like that used to access secure websites.
Lattice-based algorithms like ML-DSA are used in general-purpose digital signature systems.
A stateless hash-based digital signature system is called SLH-DSA.
IBM collaborated with outside parties to establish two standards (ML-KEM and ML-DSA), while a scientist who joined IBM co-developed the third (SLH-DSA).
Governments and businesses worldwide will use those algorithms as part of security protocols like “Transport Layer Security” (TLS) and numerous others.
The good news is that these algorithms can guard against the quantum threat. The bad news is that to adopt the new PQC standards, businesses will need to migrate their assets.
Programs to migrate cryptography algorithms in the past required years to finish. How long did your organization’s SHA1 to SHA2 migration program last? Have you upgraded the PKI trust chain key size from 1024-bit to 2048-bit, 3072-bit, or 4096-bit keys as part of your public key infrastructure (PKI) upgrading program? How long did it take for your intricate corporate environment to implement all of that? A few years?
Quantum computing and the application of post quantum cryptography standards have a wide range of effects on every aspect of your company. Numerous additional systems, security tools and services, apps, and network infrastructure are impacted by the risk of quantum computing. To protect your assets and data, your company must make the switch to PQC standards right away.
Adopt quantum-safe cryptography right now
IBM recommends implementing a quantum-safe transformation procedure to safeguard your company from “harvest now, decrypt later” threats. Use services and begin implementing solutions to enable you to implement the newly released post quantum cryptography encryption requirements.
IBM has created a thorough quantum-safe software approach that is presently being used by dozens of clients in dozens of countries, including national governments, and important businesses.
IBM suggests that clients implement a program with the following crucial stages:
Phase 1: Establish your organization’s priorities and provide your cyber teams with quantum risk awareness to get them ready.
Phase 2: Get your company ready for the PQC transfer by transforming it.
Phase 3: Implement the PQC migration for your company.
Phase 1: Get your teams ready
Focus on important areas during the first phase of the program, such as developing an organizational-wide awareness campaign to inform security subject matter experts (SMEs) and stakeholders about the quantum risk. Assign “ambassadors” or “champions” who are knowledgeable about quantum risk and its evolution, serve as the program’s main point of contact, and assist in establishing the enterprise strategy.
After that, evaluate the quantum risk to your company’s cryptographically relevant business assets, which include any asset that makes use of or depends on cryptography in general. For instance, among other things, your risk and impact evaluation should evaluate the asset’s economic significance, the complexity of its environment, and the difficulty of migration. Determine the company assets’ weaknesses, along with any necessary remedial measures, and then provide a report outlining the results to important stakeholders so they can comprehend the organizational quantum risk position. This can also be used as a starting point for creating the cryptography inventory for your company.
Phase 2: Get your company ready
In phase 2, provide your stakeholders with guidance on how to handle the priority areas that have been identified, as well as any potential quantum threats and cryptographic flaws. Next, describe corrective measures, such as flagging systems that might not be able to handle post-quantum cryptography algorithms. Lastly, outline the migration program’s goals.
At this point, IBM assists customers in creating a quantum-safe migration roadmap that outlines the quantum-safe actions necessary for your company to accomplish its goals.
As IBM counsels its clients: Prioritize systems and data for PQC migration and include important projects in your roadmaps, including creating a cryptographic governance structure. Utilize post quantum cryptography in the design and production of Cryptography Bills of Material (CBOMs) by updating your secure software development procedures and guidelines. Collaborate with your vendors to comprehend cryptography artifacts and third-party dependencies. To avoid creating new cryptographic debt or legacy, update your procurement procedures to concentrate on services and solutions that support post quantum cryptography.
“Cryptographic observability,” a cryptographic inventory that enables stakeholders to track the adoption of post quantum cryptography over the course of your quantum-safe journey, is one of the essential necessary capabilities. Data collection, analysis, and risk and compliance posture management should all be automated to enable such an inventory.
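As an illustration of the automation such an inventory enables, here is a hedged Python sketch (the asset data, function name, and algorithm sets are hypothetical and non-exhaustive) that flags assets still pinned to quantum-vulnerable algorithms:

```python
# Hypothetical data; algorithm sets are illustrative, not exhaustive.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DSA", "DH"}

def quantum_exposed_assets(inventory):
    """inventory: (asset, algorithm) pairs harvested from scans or CBOMs.
    Returns the assets still pinned to quantum-vulnerable algorithms,
    giving stakeholders a running view of PQC adoption."""
    return sorted({asset for asset, alg in inventory
                   if alg.upper() in QUANTUM_VULNERABLE})

inventory = [
    ("payments-api", "RSA"),
    ("internal-ca", "ML-DSA"),
    ("vpn-gateway", "ECDH"),
]
print(quantum_exposed_assets(inventory))  # ['payments-api', 'vpn-gateway']
```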
Phase 3: Execute your migration
Throughout phase 3 of the quantum-safe migration program, your company implements initiatives based on strategic objectives, delivery capacity, risk and cost, priority systems, and similar factors. Create a quantum-safe plan that is upheld by your company’s information security guidelines and rules.
Use standardized, tried-and-true reference architectures, migration patterns, journeys, and blueprints to carry out the technological migration.
Implement cryptographic decoupling by abstracting local cryptography processing to centralized, controlled, and readily adjustable platform services, and incorporate the facilitation of cryptographic agility into the development and migration solutions.
Incorporate a feedback loop with lessons learnt into your software. Permit the development and quick testing of fresh ideas and solutions to help the migration effort in the years to come.
Obstacles to anticipate when transitioning to PQC
Migrating many pieces is difficult. For instance, it will be more difficult to move essential internet infrastructure elements including wide area networks (WANs), local area networks (LANs), VPN concentrators, and site-to-site links. As a result, these components need more care than those that aren’t used often in the company. It is difficult to transfer core cryptography services like PKI, key management systems, secure payment systems, cryptography apps, or backends like mainframes, link encryptors, and HSMs. Dependencies on various hardware and programs, as well as problems with technology interoperability, must be taken into account.
To help guarantee compatibility and performance acceptability and spot any issues, you should also think about performance testing the post quantum cryptography standards against your internal systems and data operations. For instance, PQC occasionally calls for larger key, ciphertext, or signature sizes than are currently employed; this must be taken into consideration during integration and performance testing. Migrating to PQC standards may be challenging or impossible for certain organization-critical systems that still use outdated cryptography. It may be necessary to restructure and refactor the application.
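The size growth driving this testing can be quantified with approximate figures from FIPS 203 and FIPS 204 (classical sizes are typical values; treat all numbers as illustrative):

```python
# Approximate wire sizes in bytes. PQ sizes follow FIPS 203/204; classical
# sizes are typical values, so treat all numbers as illustrative.
SIZES = {
    "X25519 public key": 32,
    "ML-KEM-768 encapsulation key": 1184,
    "ML-KEM-768 ciphertext": 1088,
    "ECDSA P-256 signature (DER)": 72,
    "ML-DSA-65 signature": 3309,
}

def growth(classical: str, pq: str) -> float:
    """How many times larger the PQ artifact is than its classical analogue."""
    return SIZES[pq] / SIZES[classical]

print(round(growth("X25519 public key", "ML-KEM-768 encapsulation key")))  # 37
```

Growth factors of this magnitude are why message buffers, MTU assumptions, and certificate stores all need to be revisited during integration testing.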
Additional difficulties include a lack of paperwork or expertise, which has led to knowledge gaps in your company. The migration process will be made even more difficult by hardcoded data in systems, configuration files, scripts, etc.
Verify the tracking and management of your digital certificates and encryption keys. The migration will be made more difficult by poor management.
International post-quantum cryptography working groups will test some use cases but not others. Your business will have a variety of technology configurations and combinations, so you must evaluate your systems thoroughly from an end-to-end process standpoint.
Avoid waiting for regulations to change
Expect regulation outside the US to follow soon now that NIST has published the first set of post-quantum cryptography standards. Examples in the financial industry include:
Quantum risks are specifically mentioned in a regulatory technical standard for ICT risk management under the Digital Operations Resilience Act (DORA) in the EU.
It is imperative that “senior management and relevant third-party vendors understand the potential threats of quantum technology,” according to the Monetary Authority of Singapore (MAS). The necessity of “identifying and maintaining an inventory of cryptographic solutions” is also mentioned.
“A current inventory of all cryptographic cipher suites and protocols in use, including purpose and where used,” is now required by a control point in the Payment Card Industry Data Security Standard (PCI DSS) v4.0.1.
As a result, IBM suggests concentrating on creating your cryptographic governance framework, including a quantum-safe plan for your company. It should align with your company’s strategic goals, vision, and deadlines. The transformation initiative should include guidance and support from a center of excellence. The governance structure should focus on key pillars including regulatory monitoring, cryptographic assurance and risk management, delivery capacity building, and PQC education. It should offer technical design review boards, security architecture patterns, and help embedding best practices into your application development process.
Read more on govindhtech.com