Attackers don’t need a quantum computer to start winning with quantum. They just need time.

That premise – that adversaries can siphon encrypted data today and hold it until tomorrow’s decryption becomes possible, a strategy often called “harvest now, decrypt later” – framed the conversation in “Cybersecurity in Focus: Acting Today Against Tomorrow’s Threat,” a Red Hat Government Symposium session moderated by Mike Epley, chief architect and security strategist for public sector at Red Hat.

Epley was joined by panelists Steve Orrin, chief technology officer and senior principal engineer at Intel; Dustin Moody, a mathematician at the National Institute of Standards and Technology (NIST) who leads its post-quantum cryptography project; and Raymond (Ray) Romano, deputy assistant director and director of cyber threat and investigation at the Department of State. Together, they sketched a near-term reality for government IT leaders: The post-quantum cryptography (PQC) transition is no longer a theoretical future problem. It is quickly becoming a procurement, architecture, and risk prioritization problem that plays out across every environment where cryptography lives – which is nearly all of them.

The quantum clock is ticking

Federal agencies already have a policy runway for PQC. The Office of Management and Budget issued Memorandum M-23-02 to drive governmentwide PQC migration, following National Security Memorandum 10 (NSM-10). Migration begins with visibility and realism, Moody advised.

“Number one, you want to be doing a crypto inventory,” Moody said. “You need to find out where your cryptography is being used, which algorithms specifically are being used, and what information they are protecting. That might sound easy. It’s actually – in practice – a lot harder than that sounds.”

The difficulty comes from scale and sprawl, he noted: “It will be very, very complicated to trace all your systems, all your applications, and see where your cryptography is.”
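The panelists did not prescribe specific tooling, but the inventory step Moody describes can be pictured with a deliberately simplified sketch: walking a directory tree and flagging files that mention quantum-vulnerable public-key algorithms. A real inventory would also parse certificates, TLS configurations, and library calls; the file paths and algorithm list here are illustrative assumptions.

```python
import re
from pathlib import Path

# Toy inventory pass: scan readable files (configs, source, PEM
# certificates) for names of quantum-vulnerable public-key algorithms.
# This only illustrates the idea of "find where your cryptography is."
VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH)\b", re.IGNORECASE)

def inventory(root: str) -> dict[str, list[str]]:
    """Map each file under root to the vulnerable algorithm names found in it."""
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        hits = sorted({m.group(1).upper() for m in VULNERABLE.finditer(text)})
        if hits:
            findings[str(path)] = hits
    return findings
```

Even a crude pass like this surfaces the scale problem Moody describes: cryptography turns up in configuration files, application code, and infrastructure an agency may not think of as “crypto systems.”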

But the inventory is critical because it informs the next step – deciding which systems need protecting first.

“Where I come in is trying to talk to system owners about what data is going to be sensitive 10 years from now,” Romano said.

“Let’s just say, using an example from the State Department, I’m putting together talking points for a very sensitive negotiation that’s going to happen next week,” he said. “But in a week, when that negotiation happens, I’m going to say those talking points publicly … So if they decrypt it 10 years from now, I’m not really as concerned.”

The risk spikes with long-lived sensitivity. “I have to identify the information that is still going to be sensitive 10 years from now. That’s going to be my highest priority,” Romano said.

Timelines are sharpening the urgency to protect high-value data and systems from adversaries. NSM-10 sets a general deadline of 2035 for the federal government.

For national security systems (NSS), the deadline is much sooner. Under the National Security Agency’s Commercial National Security Algorithm Suite 2.0 – an updated, quantum-resistant cryptographic standard for NSS – all new acquisitions for NSS must be post-quantum ready beginning Jan. 1, 2027, Orrin noted.

It’s not a “flip-the-switch” moment

The experts cautioned that PQC migration is not a “swap the algorithm and you’re done” moment, especially given interoperability requirements and legacy stacks.

“It’s going to take time, it’s going to be gradual,” Moody said.

Hybrid approaches – pairing classical and post-quantum algorithms during transition – can help, but they come with compromises. “Implementing two cryptosystems, you’re going to have a performance hit,” Moody said. “It’s also going to be a little bit more complex, and … it opens up a larger attack surface.”
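The session did not detail a specific hybrid construction, but the common pattern – used, for example, in IETF hybrid key exchange designs – concatenates a classical shared secret and a post-quantum shared secret and feeds both into a key derivation function, so the session key stays safe as long as either algorithm holds. The sketch below stubs both secrets with random bytes (a real handshake would produce them via ECDH and an ML-KEM encapsulation) and uses a minimal HKDF-SHA256.

```python
import hashlib
import hmac
import os

def hkdf_extract_expand(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF-SHA256 (RFC 5869), sufficient for one 32-byte output block."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm = hmac.new(prk, info + b"\x01", hashlib.sha256).digest()
    return okm[:length]

# Random stand-ins keep the sketch self-contained; in practice these come
# from a classical exchange (e.g. X25519) and a PQC KEM (e.g. ML-KEM-768).
ss_classical = os.urandom(32)
ss_postquantum = os.urandom(32)

# Concatenate-and-derive: breaking the session key requires breaking
# BOTH input secrets, which is the point of the hybrid approach.
session_key = hkdf_extract_expand(
    salt=b"\x00" * 32,
    ikm=ss_classical + ss_postquantum,
    info=b"hybrid-demo",
)
assert len(session_key) == 32
```

The trade-offs Moody names are visible even here: two key exchanges mean two sets of keys to generate, transmit, and protect, plus extra code – the performance cost, complexity, and enlarged attack surface of running both systems at once.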

Romano cautioned against waiting for refresh cycles to address PQC migration. “One of the pitfalls … are people that believe, well, it’s 10 years from now … I’ll just wait for the new stuff to already have it embedded,” he said. “We really need to look at the threat landscape … [and] prioritize based on your risk.”

Confidential computing moves to the foreground

As agencies prepare for PQC, they’re also addressing a missing segment in the traditional data protection lifecycle, Orrin said. Data is typically protected at rest and in transit, but not in use, he noted.

“That’s where confidential computing comes to play,” he said. Confidential computing – hardware-based protections that isolate workloads and data while they’re being processed – is a response to cyberattacks on memory, privileged access, and shared infrastructure. It aims to create trusted execution environments that limit exposure.

Confidential computing complements agencies’ zero trust journeys, the panelists noted. While zero trust is about constraining access and continuously verifying it, confidential computing is about shrinking the blast radius when attackers inevitably find a way around controls. Together, they can help agencies reduce the number of places where sensitive information appears in plain text, how long it stays there, and who – or what – can touch it.

Across all cyber mandates and milestones, it’s critical to understand how controls are performing against evolving threats, Orrin noted. The challenge is proving that investments are improving security posture in ways that matter, not just generating more dashboards. Romano emphasized operational metrics that map to real-world resilience, including threat detection coverage – the percentage of the attack surface that is being monitored – and response speed, including mean time to detect, contain, and eradicate.
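The response-speed metrics Romano cites reduce to simple arithmetic over incident lifecycle timestamps. A minimal sketch, with hypothetical incident records standing in for what would normally come from a SIEM:

```python
from datetime import datetime

# Hypothetical incident records: when the intrusion began, when it was
# detected, and when it was contained.
incidents = [
    {"start": datetime(2024, 5, 1, 8, 0),
     "detected": datetime(2024, 5, 1, 9, 30),
     "contained": datetime(2024, 5, 1, 12, 0)},
    {"start": datetime(2024, 6, 3, 14, 0),
     "detected": datetime(2024, 6, 3, 14, 45),
     "contained": datetime(2024, 6, 3, 16, 0)},
]

def mean_hours(records, frm, to):
    """Average elapsed hours between two lifecycle timestamps."""
    deltas = [(r[to] - r[frm]).total_seconds() / 3600 for r in records]
    return sum(deltas) / len(deltas)

mttd = mean_hours(incidents, "start", "detected")      # mean time to detect
mttc = mean_hours(incidents, "detected", "contained")  # mean time to contain
```

Tracking these averages over time – alongside coverage figures such as the share of the attack surface being monitored – is one way to show that controls are improving resilience rather than just generating dashboards.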

Watch the Red Hat Government Symposium session: “Cybersecurity in Focus: Acting Today Against Tomorrow’s Threat,” and explore more sessions from the Red Hat Government Symposium.
