Current state of Quantum Computing Analysis
Superconducting Qubits: Tech giants like IBM and Google have led advancements in supercond
ucting quantum processors. In 2019, IBM unveiled the IBM Q System One, the first commercial ci
rcuit-based quantum computer (Qubit Contenders - Heidelberg Laureate Foundation ). That sam
e year Google’s 53-qubit Sycamore processor achieved a major milestone, performing a random
circuit sampling task in 200 seconds that was estimated to take 10,000 years on a state-of-the-ar
t classical supercomputer (Qubit Contenders - Heidelberg Laureate Foundation ) – a demonstrati
on of quantum supremacy. Since then, superconducting qubit counts have grown: IBM’s 2021 Ea
gle chip had 127 qubits, and in 2022 IBM debuted Osprey with 433 qubits (Qubit Contenders - H
eidelberg Laureate Foundation ). IBM’s latest Condor chip houses 1,121 qubits, breaking the 1,000
-qubit barrier (Qubit Contenders - Heidelberg Laureate Foundation ). Google recently introduced its "Willow" processor (105 qubits), which significantly improved error rates as qubit count scaled – a crucial step toward below-threshold error correction. In fact, Willow performed a benchmark computation (random circuit sampling) in under 5 minutes that would take an estimated 10^25 (10 septillion) years on the Frontier supercomputer (Meet Willow, our state-of-the-art quantum chip). This staggering speedup in a controlled experiment underscores the field's progress in both qubit count and quality. However, superconducting platforms face serious scaling challenges: as qubit numbers increase, engineering issues like cryogenic cooling, crosstalk, and calibration complexity grow. IBM's 1,121-qubit Condor, for example, required innovative refrigeration and packaging to manage the increased heat load and wiring (Qubit Contenders - Heidelberg Laureate Foundation ). Moreover, superconducting qubits are susceptible to environmental noise and have finite coherence times, so error rates tend to rise in larger systems if not mitigated (Qubit Contenders - Heidelberg Laureate Foundation ). Ongoing research therefore emphasizes improving qubit quality (coherence and gate fidelity) alongside quantity – through better materials, chip architectures (e.g. IBM's "heavy-hexagon" layout, which reduces crosstalk (IBM Quantum Computers: Evolution, Performance, and Future Directions)), and on-chip error mitigation.
Trapped Ion Qubits: Trapped-ion quantum computers (offered by IonQ, Quantinuum (Honeywel
l), and academic labs) have made steady strides, especially in achieving high-fidelity operations.
Ion qubits (individual atoms trapped by electromagnetic fields) have inherently long coherence ti
mes and identical characteristics, leading to some of the lowest error rates among qubit types. C
urrent devices have on the order of 10–30 ions; for example, IonQ’s latest system “Aria” operates
with 23 high-fidelity qubits (measured as 23 “algorithmic qubits” in IonQ’s benchmark) (IonQ Ari
a is available now, exclusively on Azure Quantum). Though the physical qubit count is lower than
superconducting chips, each trapped-ion qubit can interact with any other (full connectivity) via l
aser pulses, simplifying certain algorithms. In 2023, Quantinuum (a merger of Honeywell’s quant
um division and Cambridge Quantum) demonstrated one of the field’s first fully error-corrected al
gorithms on a trapped-ion system: a fault-tolerant execution of a simple program (one-bit additio
n) using three logically encoded qubits, with an error rate ~1.1×10^−3 – almost an order of magnit
ude better than the ~9.5×10^−3 error rate of the unencoded circuit (Quantinuum’s H1 quantum co
mputer successfully executes a fully fault-tolerant algorithm with three logically-encoded qubit
s). This used Quantinuum’s H1 system (20+ ions in a quantum charge-coupled device architectur
e) and a 3D color code for error correction, leveraging the ions’ low physical error rates (on the o
rder of 10^−4) (Quantinuum’s H1 quantum computer successfully executes a fully fault-tolerant
algorithm with three logically-encoded qubits). Such experiments validate that real-time error co
rrection can dramatically suppress errors, and they showcased entangling gates between logical
qubits in a fault-tolerant manner (Quantinuum's H1 quantum computer successfully executes a fully fault-tolerant algorithm with three logically-encoded qubits).
Trapped-ion platforms have also been at the forefront of quantum connectivity experiments – e.
g. shuttling ions between traps and even attempting photonic interconnects – which will be impor
tant for scaling up. The main challenge for ion qubits is speed and parallelism: gate operations ar
e typically slower (microseconds to milliseconds), and operations involving many ions can be difficult due to collective motional modes. Scaling to hundreds of ions in one trap is non-trivial, so researchers are
exploring modular architectures (linking many smaller traps via photonic interfaces) to build large
r systems. Despite these hurdles, the exceptional fidelity of trapped ions has allowed them to hit
milestones in quantum logic (such as demonstrating a record high two-qubit gate fidelity over 9
9.9% in experiments) and even exotic physics: notably, in 2023, Quantinuum used its system to c
reate non-Abelian anyons – emergent quasiparticles with topologically protected states – by brai
ding trapped-ion qubits, hinting at the crossover of ion technology and topological quantum com
puting (Qubit Contenders - Heidelberg Laureate Foundation ).
Photonic Qubits: Quantum computers based on photons (light particles) have also seen breakthr
ough experiments. Photonic qubits have the advantage of operating at room temperature (no nee
d for extreme cooling) and very low decoherence (photons barely interact with the environment o
r each other) (Qubit Contenders - Heidelberg Laureate Foundation ). They can leverage existing nanophotonic fabrication and fiber-optic telecom tech for integration (Qubit Contenders - Heidelberg Laureate Foundation ). The big challenge, however, is that photons' lack of inte
raction makes it hard to implement two-qubit gates and deterministic operations – photons tend t
o pass through each other unless carefully mediated by nonlinear processes or measurement-ind
uced effects (Qubit Contenders - Heidelberg Laureate Foundation ). Despite this, specialized pho
tonic systems have achieved quantum advantage in computation of specific problems. A notabl
e example is Gaussian boson sampling, a task of sampling from a distribution of many-photon int
erference, which is intractable for classical supercomputers but can be naturally solved by a net
work of optical components. In 2020, a team at USTC (China) introduced Jiuzhang, a photonic q
uantum computer that sampled distributions of 76 photons, solving in a few minutes a problem th
at would take a classical supercomputer on the order of 10,000 years (Qubit Contenders - Heidel
berg Laureate Foundation ). By 2022–2023, USTC's Jiuzhang 3.0 and Canadian startup Xanadu's Borealis (using time-bin encoded photons) could both solve Gaussian boson sampling tasks in microseconds, versus many thousands of years classically (Qubit Contenders - Heidelberg Laureate Foundation ). These experiments represent a form of quantum supremacy using photonics. Howeve
r, they are limited to specific sampling problems and not yet programmable for general algorithms
(Qubit Contenders - Heidelberg Laureate Foundation ). On the industrial side, PsiQuantum is a pr
ominent player betting on photonic qubits – aiming to build a fault-tolerant, million-qubit photonic
quantum computer using silicon photonics and photonic error correction, though their work is lar
gely theoretical/prototypical at this stage. Xanadu has also released a 24-qubit programmable ph
otonic machine (Xanadu’s Borealis was accessible via cloud), and demonstrated quantum chemi
stry simulations on photonic processors. The road to scalable photonic quantum computing will r
equire overcoming the two-qubit gate problem – approaches include using entangled photon pair
sources, measurement-based quantum computing with cluster states, or optical nonlinearities. If
successful, photonic qubits could be ideal for networking (since they naturally travel through fibe
r) and may integrate with quantum communication systems. For now, photonic quantum compute
rs stand as special-purpose accelerators showing impressive speedups for certain tasks (Qubit
Contenders - Heidelberg Laureate Foundation ), with much development ahead to make them uni
versally programmable.
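The hardness claim can be made concrete: boson sampling output probabilities are given by matrix permanents, and the best known exact classical algorithms for the permanent scale exponentially in photon number. The sketch below evaluates a permanent via Ryser's formula; the matrix and photon count are arbitrary illustrations, and the Gaussian variant discussed above replaces the permanent with the closely related (and similarly hard) hafnian (Python):

# Why boson sampling resists classical simulation: each output
# probability involves a matrix permanent, and exact evaluation (here
# via Ryser's formula) sums O(2^n) terms for n photons.
from itertools import combinations
import numpy as np

def permanent(M: np.ndarray) -> complex:
    """Ryser's formula: per(M) = (-1)^n * sum over nonempty subsets S
    of (-1)^|S| * prod_i sum_{j in S} M[i, j]."""
    n = M.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for S in combinations(range(n), r):
            total += (-1) ** r * np.prod(M[:, list(S)].sum(axis=1))
    return (-1) ** n * total

rng = np.random.default_rng(0)
n = 8   # photons; the term count doubles with every photon added
M = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)
print(f"|per(M)|^2 ~ {abs(permanent(M))**2:.3e} from 2^{n}-1 subset terms")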
Topological Qubits: One of the most visionary approaches is topological quantum computing, wh
ich seeks to encode qubits in exotic states of matter that are intrinsically protected from noise. T
he most famous proposal involves Majorana zero modes – quasiparticles that are their own antip
articles – which can emerge in certain superconducting structures. If harnessed, they could form
qubits that store information non-locally and are thus resilient to local errors. For years, Microsoft
has spearheaded R&D in this area, investing in materials science to create Majorana-based qubit
s (through its Station Q labs). This approach had few public breakthroughs until recently. In late 2
022 and 2023, Microsoft researchers reported evidence of Majorana states in topological superc
onductors and progressed toward a “topological qubit.” In February 2025, Microsoft announced
Majorana 1, “the world’s first quantum processing unit powered by a Topological Core,” built on a
new class of materials called a topoconductor (Microsoft unveils Majorana 1, the world’s first qu
antum processor powered by topological qubits - Microsoft Azure Quantum Blog). This device is
designed to scale to a million qubits on a single chip (if the approach works as hoped) (Microsof
t unveils Majorana 1, the world’s first quantum processor powered by topological qubits - Micros
oft Azure Quantum Blog). Alongside this, Microsoft published results in Nature demonstrating a h
ardware-protected topological qubit, showing that they can braid and manipulate Majorana mode
s in a small device (Microsoft unveils Majorana 1, the world’s first quantum processor powered b
y topological qubits - Microsoft Azure Quantum Blog). These claims, if fully validated, mark a tra
nsformative leap – a topological qubit could drastically reduce error rates by its nature, making q
uantum error correction far more efficient. Microsoft also revealed a roadmap to build a fault-tole
rant prototype based on topological qubits within a few years (as part of a DARPA program), rath
er than decades (Microsoft unveils Majorana 1, the world’s first quantum processor powered by t
opological qubits - Microsoft Azure Quantum Blog). While this is cutting-edge research, not yet a
large-scale working computer, it signals that topological qubits are finally leaving the realm of the
ory and entering hardware. Notably, others have pursued topological concepts in different forms:
in 2023, both Google and Quantinuum (using their superconducting and ion platforms) experimen
tally created non-Abelian anyons – particles needed for topological operations – on their quantu
m processors (Qubit Contenders - Heidelberg Laureate Foundation ). This suggests that even wit
hout a dedicated topological hardware platform, elements of topological quantum computing can
be emulated on current machines to test the concepts. If topological qubits mature, they could so
lve many scaling issues since each qubit would be far more stable against noise, but significant c
hallenges remain in fabricating and controlling the complex materials required.
Other Emerging Qubit Platforms: Beyond the above, several other approaches merit mention. Ne
utral atom qubits – using neutral atoms (like rubidium or cesium) trapped in optical tweezers and
excited to Rydberg states – have rapidly advanced. Companies like Pasqal and QuEra and labs lik
e Harvard/Caltech have built 100+ atom systems with reconfigurable 2D atom arrays. These neut
ral-atom processors have achieved high connectivity and recently demonstrated basic error corr
ection and multi-qubit entanglement on par with other platforms (Qubit Contenders - Heidelberg
Laureate Foundation ). Another contender is silicon spin qubits, which leverage the spin of electr
ons or nuclei in silicon transistor-like structures (pursued by Intel, HRL, and academic groups). T
hey promise compatibility with semiconductor manufacturing and have shown long coherence in
isotopically pure silicon. However, controlling many coupled spins at millikelvin temperatures is a
n ongoing engineering puzzle, and current devices have just a few qubits. Nitrogen-vacancy (N
V) centers in diamond – atomic defects in diamond that act as quantum two-level systems – are
being explored more for quantum networking and sensors than large-scale computing, but they o
ffer milliseconds-long coherence at room temperature and have demonstrated small-scale algorit
hms. These and other “exotic” implementations expand the toolkit of quantum hardware. Each ha
s pros and cons, but importantly, cross-platform progress is being made: error-corrected logical
qubits (discussed next) have now been realized in superconducting, ion trap, and neutral-atom s
ystems (Qubit Contenders - Heidelberg Laureate Foundation ), indicating that all these qubit tech
nologies are reaching the stage where they can perform non-trivial algorithms reliably.
Milestones of the Past Decade: The 2010s and early 2020s have seen quantum computing transition
from a lab curiosity to multi-qubit prototypes achieving tasks beyond classical reach. Some key miles
tones include:
2016: First cloud-deployed 5-qubit quantum processors (IBM Quantum Experience) accessible to
the public, spurring global research.
2019: Quantum Supremacy achieved by Google’s 53-qubit Sycamore (random circuit sampling)
(Qubit Contenders - Heidelberg Laureate Foundation ); IBM deploys System One (20 qubits) as a
n integrated quantum system for commercial use (Qubit Contenders - Heidelberg Laureate Foun
dation ).
2020: D-Wave releases Advantage, a 5,000-qubit annealing computer for business applications
(D-Wave Delivers 5000-qubit System; Targets Quantum Advantage); USTC’s Jiuzhang photonic
computer demonstrates boson sampling quantum advantage (Qubit Contenders - Heidelberg La
ureate Foundation ).
2021: IBM’s Eagle 127-qubit chip debuts, surpassing 100-qubit scale; QuEra’s neutral-atom quant
um simulator (256 atoms) solves certain physics problems not tractable classically.
2022: IBM Osprey 433-qubit processor sets a new qubit count record (Qubit Contenders - Heide
lberg Laureate Foundation ); NIST announces first standard PQC algorithms (anticipating future e
ncryption needs – see later) (NIST Announces First Four Quantum-Resistant Cryptographic Algo
rithms | NIST).
2023: IBM’s Condor (1121 qubits) is constructed (Qubit Contenders - Heidelberg Laureate Founda
tion ); Google’s Quantum AI demonstrates for the first time that a larger quantum error-correcting
code (distance-5 surface code) yields lower error rates than a smaller code, a key experimental p
roof of error suppression by scaling (Google claims milestone in quantum error correction • The
Register). Quantinuum exe
cutes a fully fault-tolerant algorithm (with 3 logical qubits) on its H1 ion trap, achieving error rates
~10^−3 (Quantinuum’s H1 quantum computer successfully executes a fully fault-tolerant algorith
m with three logically-encoded qubits). USTC continues scaling its Zuchongzhi superconducting processors (the 66-qubit Zuchongzhi 2 demonstrated supremacy-regime random circuit sampling in 2021), pushing further into the beyond-classical regime.
2024: Google’s Willow chip reaches threshold error correction, meaning adding qubits reduces o
verall error – a major breakthrough for scalability (Meet Willow, our state-of-the-art quantum chi
p) (Meet Willow, our state-of-the-art quantum chip). Willow’s benchmark showcased a staggerin
g quantum advantage (10^25-year classical task in minutes) (Meet Willow, our state-of-the-art q
uantum chip), widening the gap in “beyond-classical” performance. IBM announces successful
modular linking of multiple chips in their Quantum System Two design (a step toward quantum su
percomputers). Researchers also begin to integrate quantum processors with classical supercom
puters for hybrid computation models.
2025 (outlook): Microsoft’s Majorana-based prototype heralds a possible new era of topologicall
y protected qubits (Microsoft unveils Majorana 1, the world’s first quantum processor powered b
y topological qubits - Microsoft Azure Quantum Blog). The community is now focused on achievi
ng quantum advantage for useful problems (not just contrived benchmarks). While no practical
problem (e.g. chemistry or optimization task) has yet been definitively solved faster by a quantu
m computer than by classical ones (Qubit Contenders - Heidelberg Laureate Foundation ), the st
eady progress across all platforms suggests that moment is drawing closer. Each architecture is
maturing: superconducting and ion qubits are implementing error correction and scaling up, phot
onic and atomic systems are tackling broader programmability, and theoretical breakthroughs (al
gorithms, error-correction methods) continue to guide hardware development.
In summary, the quantum computing field is at an inflection point. Today’s hardware ranges from tens
to a few hundred qubits for gate-based machines (and thousands for specialized annealers), with qu
antum volume and fidelities improving year by year. The major recent breakthroughs – demonstratin
g quantum supremacy (Qubit Contenders - Heidelberg Laureate Foundation ), achieving logical qubits with lower error than their physical constituents (Google claims milestone in quantum error correction • The Register), and scaling to 1
000+ qubits in one device (Qubit Contenders - Heidelberg Laureate Foundation ) – have collectively
shown that larger, more reliable quantum computers are feasible. The next big goals are building mac
hines with thousands of error-corrected qubits and finding valuable real-world applications to justify t
his technology’s promise. The following sections delve into how we turn noisy physical qubits into reli
able logical qubits, the implications for cryptography, and the steps being taken to prepare for the qu
antum era.
Bosonic Codes & Cat Qubits: An alternative to using many two-level physical qubits for one logical q
ubit is to use a larger quantum system (like an oscillator mode) to encode the logical qubit within it. T
hese are bosonic codes – they use states of a bosonic mode (e.g. photons in a cavity) to store quant
um information redundantly. A prominent example is the Gottesman-Kitaev-Preskill (GKP) code, whic
h encodes a qubit in specific superpositions of a harmonic oscillator’s position/momentum eigenstate
s. Another approach, implemented in superconducting circuits, is the cat code, which uses superposi
tions of coherent states (like Schrödinger’s cat states) in a microwave resonator. The advantage of bo
sonic/cat qubits is that certain error processes (like photon loss or dephasing) can be directly detecte
d and corrected within the single mode, effectively reducing one type of error dramatically. For instan
ce, a cat qubit can be designed so that bit-flip errors are exponentially suppressed (the 0 and 1 states
are two opposite-phase coherent states that rarely flip into each other), leaving only phase-flip errors
to correct (AWS Describes an Error Correction Experiment Performed with Cat Qubits - Quantum Co
mputing Report). This hardware-efficient QEC means fewer physical resources may be needed: one
high-quality resonator mode plus a few ancilla qubits and gates can act as an error-corrected logical
qubit, as opposed to dozens of physical qubits in a surface code (AWS Describes an Error Correction
Experiment Performed with Cat Qubits - Quantum Computing Report). In 2023, Amazon Web Service
s (AWS) researchers demonstrated an error-correction experiment with cat qubits: they encoded a lo
gical qubit in a cat state within a superconducting cavity, and then used 5 physical qubits and 4 ancill
a (9 qubits total) in a repetition code to correct its remaining errors (AWS Describes an Error Correcti
on Experiment Performed with Cat Qubits - Quantum Computing Report). They implemented both dis
tance-3 and distance-5 codes (similar concept to surface code but one-dimensional) and observed l
ogical error rates slightly improving with the larger code (1.75% vs 1.65%) (AWS Describes an Error C
orrection Experiment Performed with Cat Qubits - Quantum Computing Report). This showed that ca
t qubits can indeed reach the error correction threshold with significantly fewer physical qubits than a
surface code might require for the same performance. The bosonic approach is also pursued by Yale
University and others, which have kept qubits coherent in cavities for up to minutes and demonstrate
d operations between bosonic logical qubits. The drawback is that bosonic codes often need very hig
h fidelity in the analog components (e.g. photon lifetime in the cavity, or the ability to reliably add/rem
ove single photons for error correction), and managing many oscillators with feedback could be com
plex. Nonetheless, bosonic and cat qubits are a promising route to hardware-efficient fault tolerance,
potentially cutting down the overhead per logical qubit by a factor of ~5-10 in the near term (AWS De
scribes an Error Correction Experiment Performed with Cat Qubits - Quantum Computing Report). As hardware improves, a hybrid approach might emerge: using bosonic-encoded qubits that are
themselves stabilized by a small surface/repetition code – effectively a concatenated code where the
first layer is bosonic, second layer is standard QEC (Hardware-efficient quantum error correction usi
ng concatenated ...). This is an active area of research aiming to bring error rates from ~1% down to t
he 0.01% or 0.001% range needed for large algorithms.
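The distance-3 versus distance-5 comparison above follows textbook repetition-code arithmetic: the code fails only when a majority of its qubits flip between correction rounds. A small illustrative calculation makes the scaling visible (Python; the per-qubit error rate is an assumed round number, not AWS's measured value):

# Logical error rate of a distance-d repetition code, the 1D code used
# in the AWS cat-qubit experiment to correct the remaining phase flips.
# The code fails when a majority of its d qubits flip in one round.
from math import comb

def logical_error(p: float, d: int) -> float:
    """P(majority of d qubits flip), given independent flip probability p."""
    t = (d - 1) // 2   # number of errors the code can correct
    return sum(comb(d, k) * p**k * (1 - p)**(d - k) for k in range(t + 1, d + 1))

p = 0.01   # assumed per-qubit error per round (illustrative)
for d in (3, 5, 7):
    print(f"d={d}: logical error ~ {logical_error(p, d):.1e}")
# Below threshold, each step up in distance multiplies the error down --
# the same suppression-by-scaling effect described above.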
Other Codes (Color Codes, etc.): Besides surface codes and bosonic codes, there are numerous QE
C codes being explored. Color codes (a class of topological codes in 3D or 2D with certain propertie
s) are interesting because some of them allow transversal implementation of a full gate set (making c
ertain quantum gates easier on logical qubits). Quantinuum’s fault-tolerant one-bit addition experimen
t used a small color code on their ion trap (Quantinuum’s H1 quantum computer successfully execut
es a fully fault-tolerant algorithm with three logically-encoded qubits), which, combined with a trans
versal CCZ gate, significantly reduced the number of operations needed (they cut required two-qubit
gates from >1000 to just 36 for that algorithm) (Quantinuum’s H1 quantum computer successfully ex
ecutes a fully fault-tolerant algorithm with three logically-encoded qubits). This exemplifies how sma
rter codes can improve both reliability and efficiency. There are also quantum LDPC codes under dev
elopment that promise constant overhead per logical qubit (rather than overhead growing with code
distance) – potentially game-changing if their high theoretical thresholds can be reached in hardware.
However, those require long-range connections or complex qubit layouts. Magic-state distillation is a
nother part of error correction: producing low-error “magic” states for non-Clifford gates. This is an o
verhead on top of logical qubits themselves; recent code improvements (like using CCZ gates transve
rsally in color codes as Quantinuum did (Quantinuum’s H1 quantum computer successfully executes
a fully fault-tolerant algorithm with three logically-encoded qubits)) aim to reduce that cost.
In summary, quantum error correction has progressed from theory to experiment in the past few yea
rs. Surface codes have been implemented on superconducting and ion platforms, hitting the break-e
ven point where a logical qubit's error is lower than the best physical qubit's error (Google claims milestone in quantum error correction • The Register). Bosonic and cat codes have shown hardware efficiency, operating below error
thresholds with fewer resources (AWS Describes an Error Correction Experiment Performed with Cat
Qubits - Quantum Computing Report). We now have the first glimpses of fault tolerance in action: sm
all algorithms run on encoded qubits with errors corrected on the fly (Quantinuum’s H1 quantum com
puter successfully executes a fully fault-tolerant algorithm with three logically-encoded qubits). Still,
the overheads remain high – one logical qubit might occupy tens or hundreds of physical qubits even
in near-term demonstrations. For truly scalable quantum computing (dozens or hundreds of logical q
ubits running deep circuits), the community anticipates needing error-corrected machines on the ord
er of 10^4–10^6 physical qubits. A recent estimate, for instance, suggested ~20 million physical qubits
(with reasonable error rates) would be needed to factor a 2048-bit RSA number in about 8 hours usin
g Shor’s algorithm (How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits – Qu
antum) (How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits – Quantum). Thi
s daunting number highlights that we have a way to go – but also that there are no known physics bar
riers to reaching that scale over time. Through a combination of better physical qubits (higher cohere
nce, lower gate error) and better codes, the physical-to-logical qubit ratio will improve. Many researc
hers are optimistic that by the late 2020s or 2030s, we will have demonstrators with on the order of 1
000 physical qubits (yielding a handful of logical qubits) and then scaling up from there. The work ha
ppening now on surface vs. bosonic vs. other codes will determine the most efficient path. If breakthr
oughs like topological qubits materialize, the overhead per logical qubit could drop dramatically (perh
aps a logical qubit per only a few physical qubits if each physical is inherently stable), accelerating th
e timeline to large-scale quantum computers.
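A rough feel for these physical-to-logical ratios comes from the widely used surface-code rule of thumb that logical error per round falls as roughly 0.1·(p/p_th)^((d+1)/2) while physical cost grows as ~2d² qubits per logical qubit. The sketch below applies that heuristic with assumed, illustrative numbers, not figures from any specific paper (Python):

# Rough surface-code overhead estimate using the common rule of thumb
# p_L ~ 0.1 * (p / p_th)^((d + 1) / 2) per round, with ~2*d^2 physical
# qubits per logical qubit (rotated surface code).
def distance_needed(p: float, p_th: float, target: float) -> int:
    """Smallest odd distance d with 0.1 * (p/p_th)**((d+1)/2) <= target."""
    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2
    return d

p, p_th = 1e-3, 1e-2   # assumed physical error rate and threshold
for target in (1e-6, 1e-9, 1e-12):
    d = distance_needed(p, p_th, target)
    print(f"logical error {target:.0e}: d={d}, ~{2 * d * d} physical/logical")
# Each +2 in distance buys another 10x suppression at p = p_th/10,
# which is the qualitative scaling behind the estimates above.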
The current standard for RSA is 2048-bit keys, and for elliptic curves (like the commonly used se
cp256r1 curve) roughly 256-bit group order. Both are believed infeasible to break with classical comp
uters (taking far longer than the age of the universe). A quantum computer, however, could in principl
e break them in a matter of hours or days if it has enough logical qubits and low error rates to run Sh
or’s algorithm.
The big question is when such a cryptographically relevant quantum computer (sometimes termed C
RQC) will exist. Estimates vary widely. A report by the Global Risk Institute in 2022–2023 surveyed ex
perts and found the median estimate was about 15 years for a quantum computer that can break RS
A-2048 in roughly 24 hours (Quantum Computing Advances in 2024 Put Security In Spotlight). That
would put the timeline around 2037–2040. In other words, many experts believe that by the late 2030
s, we’ll likely have the capability to break commonly used public-key crypto within a day. Some are m
ore optimistic: a few predict that such capabilities could emerge in less than a decade (even by ~203
0), especially with concentrated investment and if unforeseen breakthroughs occur (Hype or Reality:
Will Quantum Computing Break Encryption?). For example, there have been speculative claims that a
billion-dollar quantum computer project could possibly crack RSA-2048 by 2030, though these remai
n highly uncertain and often criticized as optimistic (Hype or Reality: Will Quantum Computing Brea
k Encryption?). On the conservative side, others argue it may take 20+ or even 30 years – one survey
noted some experts think RSA-2048 will remain safe until at least 2040 or beyond if progress is slow
er than anticipated (Quantum Computing Encryption Threats: Why RSA and AES-256 ...). Notably, in
2022 the U.S. National Institute of Standards and Technology (NIST) stated that RSA-2048 “should co
ntinue to offer sufficient protection through at least 2030” (Setting the Record Straight on Quantum C
omputing and RSA ...), implying they don’t expect a quantum breaker in the very near term (but beyo
nd that, all bets are off). It’s worth mentioning that ECC (256-bit) would actually fall even sooner than
RSA-2048 with Shor’s algorithm, since the problem sizes are roughly equivalent in complexity – break
ing a 256-bit elliptic curve key is comparable to a 3072-bit RSA key in classical security, but to a qua
ntum computer, both are polynomial-time tasks of similar scale. Thus, if RSA-2048 is in danger by mi
d/late 2030s, then widely used ECC (like the curves securing most mobile and TLS communications t
oday) is equally in danger, if not more so (ECC keys are shorter, though algorithmic complexity differs
slightly).
From a technical standpoint, what would it take to run Shor’s algorithm on RSA-2048? State-of-the-a
rt research gives some indicative numbers. In 2019, Craig Gidney (Google) and Martin Ekerå (KTH) pu
blished an optimized methodology for factoring 2048-bit RSA on a quantum computer. They estimate
d that using their approach, one could factor a 2048-bit number in about 8 hours on a quantum com
puter with 20 million noisy physical qubits (assuming a physical error rate around 10^−3 and surface
-code error correction) (How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits – Quantum). This corresponds to on the order of a few thousand logical qubits – their idealized algorithm need
ed about 6,000 logical qubits (3n qubits for the main routine, plus some overhead) for an 𝑛=2048-bit
number (How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits – Quantum). Th
e 20 million figure comes from the overhead of error correction to make those logical qubits reliable f
or the billions of gate operations required. Earlier estimates prior to their work were much higher (hun
dreds of millions or more physical qubits and longer run-times), so 20 million was already a hundredf
old cost reduction over prior projections (How to factor 2048 bit RSA integers in 8 hours using 20 mi
llion noisy qubits – Quantum). Still, 20 million qubits is far beyond the few thousand qubits available t
oday, so we are not close yet. Another recent analysis by Fujitsu researchers (2023) suggested that a
fault-tolerant quantum computer with about 10,000 logical qubits could break RSA-2048 in on the or
der of a few months (104 days) (Quantum Computers Could Crack Encryption Sooner Than ...) – agai
n, consistent with needing thousands of perfect qubits or many millions of physical ones.
In practice, achieving this requires not only many qubits but also long circuit depth: Shor's algorithm for 2048-bit RSA might need on the order of 10^12 quantum gate operations. Those gates must be executed faster than errors accumulate, hence the need for error-corrected qubits. If one has, say, a milli
on physical qubits with error correction, the total operation time might be on the order of hours to day
s, which is feasible if clock speed is in the MHz range for logical operations. Gidney & Ekerå's scenari
o basically posits a machine that can execute ~10^14 logical operations in 8 hours (How to factor 204
8 bit RSA integers in 8 hours using 20 million noisy qubits – Quantum). These numbers highlight why
breaking RSA is a medium-to-long term prospect: we need much larger and more reliable machines t
han exist now.
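The quoted figures hang together arithmetically, as the quick check below shows (all inputs are the order-of-magnitude numbers cited above, not new estimates) (Python):

# Sanity-checking the Gidney-Ekera style numbers quoted above:
# ~6,000 logical qubits, ~20 million physical qubits, ~8 hours,
# and ~10^14 logical operations overall.
logical_qubits  = 6_000
physical_qubits = 20_000_000
runtime_s       = 8 * 3600
logical_ops     = 1e14

print(f"physical qubits per logical: ~{physical_qubits // logical_qubits}")
# -> ~3333, the footprint of a high-distance surface-code patch

per_qubit_rate = logical_ops / (logical_qubits * runtime_s)
print(f"per-qubit logical op rate: ~{per_qubit_rate:.1e} ops/s")
# -> ~6e5 ops/s, i.e. sub-MHz logical clock speeds with all 6,000
#    logical qubits operating in parallel -- consistent with the
#    MHz-range figure mentioned above.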
The implications of a successful quantum attack on RSA/ECC are enormous. Virtually all secure Inter
net communications (TLS/SSL), digital signatures, cryptocurrencies, and secure email/transactions re
ly on these algorithms. A quantum computer that can break them would render current public-key infr
astructure obsolete – past messages encrypted with RSA/ECC could be decrypted, and forged digital
signatures would become possible, undermining trust in software updates, financial transactions, etc.
This looming threat has prompted warnings from security agencies. The U.S. National Security Agenc
y (NSA) in 2015 announced plans to transition to quantum-resistant cryptography, and quietly advise
d phasing out legacy ECC (Suite B) algorithms. Intelligence agencies are also concerned about the “h
arvest now, decrypt later” strategy: an adversary can record sensitive encrypted communications no
w (for example, diplomatic or military traffic) and store them, with the expectation that in 10-15 years
a quantum computer could decrypt those if they haven’t been post-quantum protected. Indeed, cyber
security experts note there are likely active efforts by nation-states to intercept and save encrypted d
ata today, anticipating future decryption when quantum capabilities arrive (Quantum Computing Adv
ances in 2024 Put Security In Spotlight). This prospect has significantly accelerated efforts in the se
curity community to migrate to quantum-safe encryption (discussed in the next section).
It’s worth noting a few real-world developments: So far, Shor’s algorithm has been demonstrated on
only very small numbers (like factoring 15 or 21 using a handful of qubits) – these were laboratory pro
ofs of concept. In late 2022, there was a brief stir when a group of academic researchers (in China) c
laimed to have used a quantum-derived algorithm to factor a 48-bit number and speculated about sc
aling to RSA-2048 with only 372 qubits. However, experts quickly debunked this as a misinterpretatio
n – they were essentially using a variant of classical Schnorr’s algorithm with a quantum assist, which
does not generalize to breaking RSA-2048 efficiently. The consensus remains that full Shor’s algorith
m on large keys is needed to break RSA/ECC, and thus a large fault-tolerant quantum computer is req
uired. As of 2025, no such machine exists, and even the largest experimental devices (with 100+ nois
y qubits) are roughly 10–15 logical qubits short of running even a toy version of Shor's on, say, an 8-bit number (such as 255).
Symmetric ciphers face a different, milder threat: Grover's algorithm speeds up brute-force key search quadratically, effectively halving the key length. For AES-128 (128-bit key), Grover's would reduce the effective security to about 2^64 operations. 2^6
4 (roughly 1.8 × 10^19) operations is borderline – classical computing cannot do 2^128, but 2^64 is po
tentially within reach of a powerful quantum computer, though still extremely high (for comparison, 2^
64 operations at 1 billion ops per second would take ~584 years; a quantum computer could in theory
massively parallelize Grover or run at higher gate rates, but it gives a sense of scale). Thus, AES-128 i
s considered not safe in a post-quantum world (Bluefin: Don’t forget about AES-256 in the fight agai
nst quantum threats - Inside Quantum Technology). AES-256, on the other hand, would drop to an ef
fective 2^128 security level under Grover’s algorithm, which is astronomically large – on par with the o
riginal classical security of AES-128. Because 2^128 operations is still completely impractical, AES-25
6 is widely believed to be quantum-resistant in the sense that no feasible quantum attack would und
ermine it within relevant timeframes (Bluefin: Don’t forget about AES-256 in the fight against quantu
m threats - Inside Quantum Technology). As one expert succinctly put it: “According to Grover’s algor
ithm, a brute-force attack time can be reduced to its square root. But if this time is still sufficiently lar
ge, it becomes impractical... For AES-128 this is 2^64 (not safe enough), but AES-256 is 2^128... Henc
e, it is considered post-quantum computing resistant.” (Bluefin: Don’t forget about AES-256 in the fig
ht against quantum threats - Inside Quantum Technology).
We can quantify what “impractical” means here. A 2019 research paper from Kryptera estimated that
to brute-force AES-256 with Grover’s algorithm, a quantum computer would need over 6,600 logical
qubits and an enormous number of operations, likely requiring a machine with millions of physical qu
bits (Bluefin: Don’t forget about AES-256 in the fight against quantum threats - Inside Quantum Tech
nology). In effect, the resource requirements to even match the security of AES-256 are so high that
simply using AES-256 (instead of AES-128) is a very effective hedge against quantum attacks. Anoth
er source estimated that to crack AES-256 in a realistic timeframe, on the order of 2^128 quantum steps, one might need something like 2.97 × 10^8 (297 million) qubits if executed naively without significa
nt parallel speedups (Quantum Computing Encryption Threats: Why RSA and AES-256 ...) – a numbe
r far beyond any foreseeable quantum computer. Even with more clever circuit optimization, it’s clear
that AES-256 provides a comfortable security margin. For this reason, NIST and cryptographers view
AES-256 as secure in the post-quantum era, and in fact recommend moving to 256-bit symmetric ke
ys (and 384-bit for SHA-2 hashes) as a precaution. Many industry experts remind us not to forget the
strong protection symmetric algorithms offer – for instance, one article emphasizes that AES-256, wit
h its huge key space, is a solid line of defense against quantum attacks, and enterprises should adopt
it if not already using it (Bluefin: Don't forget about AES-256 in the fight against quantum threats - Inside Quantum Technology).
That said, Grover’s algorithm could still pose a threat in certain contexts. If an attacker had a quantum
computer with, say, a few thousand logical qubits, they might target shorter symmetric keys (some le
gacy encryption, older protocols, or less-common cryptos use 64- or 80-bit keys, which would be tri
vial under Grover). Also, Grover’s algorithm can attack hashing (for example, finding a preimage in a h
ash function in √N steps, which is why hash-based signatures double output length for safety). But fo
r the mainstream AES and SHA-256, using 256-bit keys and outputs is sufficient to parry Grover’s atta
ck.
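The arithmetic behind these security levels is simple enough to verify directly; the operation rate below is the same illustrative 10^9 ops/s used above, and the calculation ignores that Grover iterations are inherently serial, which only makes the attack harder (Python):

# Grover's algorithm reduces brute-force key search from 2^k to
# about 2^(k/2) steps, halving the effective key length.
def grover_years(key_bits: int, ops_per_second: float) -> float:
    steps = 2 ** (key_bits // 2)   # Grover iterations needed
    return steps / ops_per_second / (3600 * 24 * 365)

for k in (128, 256):
    print(f"AES-{k}: 2^{k // 2} steps -> "
          f"~{grover_years(k, 1e9):.3g} years at 1e9 ops/s")
# AES-128: 2^64 steps  -> ~585 years    (borderline: 'not safe enough')
# AES-256: 2^128 steps -> ~1.1e22 years (hopelessly impractical)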
The term “Q-day” is sometimes used to denote the day when quantum computers can break existing
crypto. It is often predicted as 10-20 years out, but because data can be harvested now and broken l
ater, effectively anything sensitive that needs confidentiality beyond the 2030s should assume Q-day
is coming. Intelligence and security communities are almost certainly monitoring quantum computin
g progress closely. It’s widely believed that whoever first develops a CRQC would likely keep it secret
(to exploit it) rather than announce it – which is an extra motivation for defenders to switch to quantu
m-safe methods well before such a computer is built. The mere possibility that powerful adversaries
might get a quantum decrypting tool has been compared to a “cryptographic apocalypse” scenario fo
r outdated encryption.
In practical terms, many organizations (financial institutions, cloud providers, etc.) are already testing
hybrid encryption models: for example, TLS handshakes that use both an RSA/ECDH key exchange a
nd a post-quantum key exchange in parallel – so that even if RSA/ECDH is broken later, the session k
ey is still protected by the PQC algorithm. Akamai’s security team notes they are doing exactly this: c
ombining classical and post-quantum key exchange, because the new algorithms are still young and i
t’s wise to have a classical backup – “If we do a hybrid…and if one fails, at least we’ll have the other o
ne” (NIST Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey). This belt-and-su
spenders approach is likely to be a common strategy in the transition period.
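Conceptually, the hybrid handshake reduces to deriving one session key from two independent secrets. The sketch below shows that combiner pattern using the Python cryptography package for the classical X25519 half; the Kyber/ML-KEM half is a labeled stand-in, since no particular PQC library is assumed, and the KDF labels are arbitrary (Python):

# Hybrid key exchange combiner: derive the session key from BOTH a
# classical X25519 shared secret and a post-quantum KEM shared secret,
# so the session stays protected unless both are broken.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

kem_shared_secret = os.urandom(32)   # placeholder for Kyber/ML-KEM output

# Feed both secrets into one KDF, as hybrid TLS drafts do: an attacker
# must recover BOTH inputs to learn the session key.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(classical_secret + kem_shared_secret)
print(session_key.hex())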
In summary, breaking modern encryption via Shor’s algorithm is one of the killer applications (or dan
gers) of quantum computing, but it remains a future prospect – experts say likely 10-20 years away fo
r strong 2048-bit keys (Quantum Computing Advances in 2024 Put Security In Spotlight). However, t
he mere finite horizon of that threat has already set in motion a comprehensive effort to switch to qua
ntum-resistant cryptography. Meanwhile, symmetric ciphers like AES-256 are expected to hold up a
gainst quantum attacks (with an increased key length if needed) (Bluefin: Don’t forget about AES-256
in the fight against quantum threats - Inside Quantum Technology), so the focus is primarily on repla
cing public-key algorithms. The next section delves into these replacement algorithms and how the w
orld is preparing for the post-quantum cryptography era.
FALCON: Another lattice-based digital signature (based on NTRU lattices and using Fourier samp
ling). FALCON produces much smaller signatures (~few hundred bytes) than Dilithium but is more
complex (uses floating point arithmetic in implementation). NIST recommends FALCON as a seco
ndary algorithm for applications that need smaller signatures or where Dilithium’s larger signature
size is an issue (NIST Announces First Four Quantum-Resistant Cryptographic Algorithms | NIS
T). FALCON’s draft standard is expected as FIPS 206 (NIST Releases 3 Post-Quantum Standards,
Urges Orgs to Start PQC Journey).
SPHINCS+: A hash-based signature scheme. It is stateless (doesn’t require keeping track of one
-time keys like earlier hash-based sigs) and relies only on the security of hash functions (usually
SHA-256). SPHINCS+ has the advantage of very conservative security assumptions (hash functi
ons are well studied), but its signatures are large (~ tens of kilobytes) and signing is slower. NIST
chose it as an alternative digital signature (FIPS 205, SLH-DSA) for those who desire diversity in
cryptographic assumptions (NIST Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC
Journey). Hash-based signatures are seen as a reliable fallback since even if all number-theoreti
c assumptions were broken, hashes could remain secure.
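To see why hash-based schemes rest on such conservative assumptions, consider the Lamport one-time signature, the simplest ancestor of SPHINCS+: signing just reveals one of two hash preimages per message-digest bit, so forging requires inverting the hash. SPHINCS+ layers many few-time signatures into a stateless tree; the toy below is illustrative only (Python):

# Lamport one-time signature: the private key is 256 pairs of random
# values, the public key their hashes, and a signature reveals one
# preimage per digest bit. Security rests entirely on the hash function.
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg: bytes):
    d = hashlib.sha256(msg).digest()
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(msg))))

sk, pk = keygen()
sig = sign(b"hello pqc", sk)          # 256 * 32 bytes = 8 KB of signature:
print(verify(b"hello pqc", sig, pk))  # hash-based signatures run large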
These four algorithms were the “winners” of the first phase. In 2024, NIST released the final standard
s for Kyber, Dilithium, and SPHINCS+ (and plans to finalize FALCON soon) (NIST Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey). Meanwhile, NIST launched a 4th round to consider some additional c
andidates (like code-based KEMs such as Classic McEliece, and certain multivariate and isogeny sch
emes) for potential inclusion as backups or specialized use-cases (NIST Announces First Four Quant
um-Resistant Cryptographic Algorithms | NIST) (NIST Releases 3 Post-Quantum Standards, Urges O
rgs to Start PQC Journey). Notably, a previous alternate candidate, SIKE (an isogeny-based encryptio
n scheme), was broken by classical cryptanalysis in 2022, which underscores the importance of the
thorough vetting process – it was fortunate that SIKE was not in the final four. NIST’s selections emph
asize confidence and performance: Kyber and Dilithium (both developed with contributions from IB
M, Google and others) are expected to be the workhorses of post-quantum encryption and signing (N
IST Unveils 3 PQC Algorithms Ready for Immediate Use - MeriTalk) (NIST Announces First Four Qua
ntum-Resistant Cryptographic Algorithms | NIST).
In summary, the industry now has concrete standards for quantum-resistant crypto. As one news hea
dline put it, this is “the first major milestone for cryptography since the adoption of AES in 2001” (NIS
T Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey). The algorithms are ready
for implementation: for example, Kyber is being integrated into TLS libraries, and Dilithium certificates
are being tested in browsers.
Web and Internet Protocols: All major browser vendors and tech companies have run experiment
s with PQC in TLS. In 2022, Cloudflare and Google conducted large-scale trials of hybrid TLS key
exchanges using Kyber (at that time combined with X25519 ECDH) to assess performance and c
ompatibility. The data showed PQC algorithms can be integrated without breaking the internet, th
ough there were occasional issues with devices that had outdated implementations (some old TL
S libraries couldn’t handle the larger key sizes, for instance). Today, companies like Cloudflare, A
WS, Microsoft, and Google have enabled options for PQC in their cloud services. Akamai has dep
loyed hybrid key exchanges for some of its customers already (NIST Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey). One approach to ease adoption is the hybrid mode mentioned earlie
r: by doing classical+PQC together, one gains confidence in PQC while still relying on classical se
curity as a backstop during the transition period.
VPNs and Secure Networks: Vendors of VPN products (Cisco, etc.) and government networks h
ave begun adding support for PQC algorithms (like using Kyber instead of RSA for key exchange i
n IPSec/IKE or TLS VPNs). The U.S. National Security Systems (NSA Suite) has issued requireme
nts that certain high-security communications start moving to approved interim quantum-resistan
t solutions (even prior to the final NIST standards, they have interim “Commercial National Securit
y Algorithm Suite 2.0” which includes things like X25519+Kyber hybrid).
Cryptographic libraries: Open-source crypto libraries (OpenSSL, BoringSSL, libsodium, etc.) hav
e added implementations of the NIST PQC algorithms. OpenSSL 3.0+ has module support where
you can plug in PQC providers. Microsoft had been testing SIKE in its SChannel library before that scheme was broken, and has since shifted its testing to other algorithms. In August 2023, OpenSSL released an alpha that included Kyber and Dilithium, signaling that mainstream adoption is close.
Hardware and IoT: A big challenge is ensuring that low-power and embedded devices can also b
e upgraded. Some PQC schemes have large public keys – e.g. Classic McEliece (still under consi
deration) has a public key of a megabyte, which is not feasible for smart cards. Fortunately, the c
hosen algorithms have moderate sizes (Kyber’s ~800-byte key and ~700-byte ciphertext, Dilithiu
m’s few-kB signature). Many IoT devices can handle those, but older hardware (like say, a constr
ained microcontroller that comfortably handled 256-bit ECC but might struggle with a 3kB signat
ure) will need optimization. Companies such as IBM have demonstrated hardware acceleration fo
r lattice ops, and there’s work on making sure PQC can run on smart cards and TPMs. We’re likel
y to see new versions of security chips that accelerate lattice-based math (much like current one
s accelerate RSA/ECC) to ease this transition.
Cryptographic Agility: A lesson from this process is the importance of crypto agility – designing
systems that can swap out crypto algorithms without extensive rewrites. Protocols like TLS 1.3 ar
e fairly agile (new cipher suites can be added), whereas some older protocols baked in specific a
lgorithms. Industry standards bodies (IETF, etc.) are updating protocols to allow PQC algorithms
(for instance, there are drafts for PQC in X.509 certificates, in S/MIME email, etc.). One concrete
example: a new X.509 extension might allow a certificate to have both an ECC signature and a Dil
ithium signature for compatibility. This way older clients verify the ECC part, newer ones verify th
e PQC part.
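A minimal sketch of that dual-signature verification logic follows, with Ed25519 standing in for the classical algorithm and the PQC verifier left as a pluggable stub; the DualSignedCert type, field names, and pqc_verify hook are hypothetical names for illustration, and no PQC library is assumed (Python):

# Dual-signature certificate sketch: the body carries both a classical
# signature and an optional PQC signature. Newer clients check the PQC
# part when present; every client still checks the classical part.
from dataclasses import dataclass
from typing import Callable, Optional
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)

@dataclass
class DualSignedCert:
    tbs: bytes                  # the to-be-signed certificate body
    classical_sig: bytes
    pqc_sig: Optional[bytes]    # absent on legacy-issued certificates

def verify_cert(cert: DualSignedCert, classical_pub: Ed25519PublicKey,
                pqc_verify: Optional[Callable[[bytes, bytes], bool]] = None) -> bool:
    if cert.pqc_sig is not None and pqc_verify is not None:
        if not pqc_verify(cert.tbs, cert.pqc_sig):
            return False        # PQC-aware client rejects a bad PQC sig
    try:
        classical_pub.verify(cert.classical_sig, cert.tbs)
        return True
    except InvalidSignature:
        return False

key = Ed25519PrivateKey.generate()
body = b"example-cert-body"
cert = DualSignedCert(body, key.sign(body), pqc_sig=None)
print(verify_cert(cert, key.public_key()))   # True via the classical path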
The readiness for transition varies by sector. Big Internet companies are moving fastest, often workin
g alongside NIST’s timeline. Finance and government are keenly aware – banks have customer data t
hat needs decades of security, so many are already experimenting with quantum-safe VPNs between
data centers. Cloud providers (AWS, Azure, Google Cloud) have all announced preview services for q
uantum-safe cryptography – for instance, AWS KMS offers PQC key options, Azure Quantum has inte
gration for PQC in some offerings, and Google Cloud has a service to help customers implement hybr
id encryption.
Testing and confidence: These PQC algorithms are new. While they’ve been through years of an
alysis during the NIST process, they haven’t seen decades of real-world attacker scrutiny like RS
A/ECC have. There is some reluctance to fully trust them until they’ve matured. This is why hybri
d approaches are popular initially. Over time, as the algorithms prove themselves and more crypt
analysis is done (and as urgency increases with advancing quantum tech), confidence will solidif
y.
Interoperability: In the near term, not everyone will upgrade at once. This means, for example, a
web server might support both classical and PQC algorithms, and negotiation must choose the b
est common algorithm. During the transition, we might see “algorithm mismatch” issues if, say, a
n email from a PQC-enabled system goes to a legacy system. Careful planning and broad compa
tibility testing are needed to avoid communication failures.
Legacy systems: Perhaps the hardest part is upgrading systems that are not easy to update – old
hardware in industrial control systems, embedded devices in medical equipment, etc. Many such
systems might not get a firmware update and could become vulnerable if not replaced. Organizat
ions will have to inventory where they use vulnerable cryptography and decide whether to upgra
de or retire those systems. Governments are now requiring this inventory as a first step (NIST Rel
eases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey).
Human and organizational factors: Companies need to educate their workforce about PQC, upd
ate internal standards, get new certificates issued (certificate authorities will need to begin issuin
g PQC-based certificates), and so on. It’s a significant effort akin to the Y2K preparation or the mi
gration from SHA-1 to SHA-2, but arguably larger in scope because it affects fundamental securit
y of everything.
Overall, the world is gearing up for PQC. In 2024, NIST’s final standards gave the green light for adop
tion (NIST Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey). The NSA in the U.S. has set timelines f
or national security systems to transition by 2035. The EU has its own project (EU PQC and migration
within the Eurozone). We are entering the period where “crypto agility” and “post-quantum readines
s” are becoming standard terms in cybersecurity planning. The good news is that the algorithms sele
cted are efficient enough that, for most users, the switch will be behind the scenes and largely unnoti
ceable (a web browser using TLS with Kyber/Dilithium might add only minimal latency or bandwidth o
verhead that users won’t detect). The urgent message from experts is to start migrating sooner rathe
r than later – “Companies should be starting to get concerned about a usable quantum computer no
w,” one expert advised, not because it exists yet but because data being encrypted today could be ex
posed in the future (Quantum Computing Advances in 2024 Put Security In Spotlight). By adopting P
QC in the next few years, organizations can ensure their data and communications remain secure wel
l into the era of quantum computing.
IBM: IBM has been at the forefront of quantum hardware development, focusing on superconduc
ting qubits. In 2016, IBM put the first 5-qubit device on the cloud and has since steadily increased
qubit counts while improving coherence. IBM's quantum processors have evolved from early 5- and 16-qubit devices to the 27-qubit "Falcon", then the 65-qubit "Hummingbird", the 127-qubit "Eagle" (2021), and the 433-qubit "Osprey" (2022) (Qubit Contenders - Heidelberg Laureate Foundation ). IBM's 1,121-qubit "Condor" chip, delivered in 2023, represents a major scalin
g milestone (Qubit Contenders - Heidelberg Laureate Foundation ). Alongside hardware, IBM intr
oduced the concept of Quantum Volume (a holistic performance metric) and has demonstrated s
teady progress on it. IBM’s roadmap, updated in 2023, outlines a vision of quantum-centric supe
rcomputers: by 2025–2026 they aim to connect multiple chips into a single system (IBM Quantu
m System Two) to reach effective qubit counts in the tens of thousands, and beyond that, they fo
resee modular and networked quantum processors scaling to a million qubits by the 2030s (Microsoft unveils Majorana 1, the world's first quantum processor powered by topological qubits - Microsoft Azure Quantum Blog). IBM is also a leader in software
with its Qiskit framework and has a large community of users. Their research teams have achieve
d records in quantum error mitigation and recently demonstrated a 127-qubit entanglement (all q
ubits entangled) on the Eagle processor. IBM’s collaborations with national labs and universities h
ave pushed applications in chemistry and optimization on their hardware. In short, IBM is driving
an integrated approach: improving hardware quality, architecture (e.g. heavy-hex lattices, 3D pac
kaging), and software to reach what they term Quantum Advantage (useful tasks better than clas
sical) on the way to fault tolerance. IBM’s contributions are evidenced by them holding the curren
t record for highest qubit count in a gate-model quantum processor (1121) and continuously publi
shing innovations in devices and materials (e.g. new transmon designs, faster cryo-electronics, e
tc.).
Google: Google’s Quantum AI division (based in Santa Barbara) made headlines with the quantu
m supremacy experiment in 2019 (Qubit Contenders - Heidelberg Laureate Foundation ), but the
y didn’t stop there. Google’s strategy has been twofold: demonstrate landmark achievements (su
premacy, error correction milestones) and build towards a full-scale fault-tolerant machine within
a decade. Google had briefly explored the larger 72-qubit "Bristlecone" chip before Sycamore (53 coupled superconducting qubits), then pivoted to focus on quality over quantity – improving gate fidelities and reducing crosstalk. In 2021–2022, they published experiments on Quantum Error Corre
ction (using surface codes on 21 qubits) and in 2023 reached the breakthrough of suppressing er
ror by increasing code size (Google claims milestone in quantum error correction • The Registe
r). Their latest hardware, the Willow processor (105 qubits), has shown that adding qubit
s can exponentially suppress certain errors (Meet Willow, our state-of-the-art quantum chip). Wi
llow’s demonstration of a 5-minute computation that would take 10^25 years classically (Meet Wil
low, our state-of-the-art quantum chip) is a strong validation of their hardware advances. Google
is also heavily investing in a new quantum fabrication facility (they built a dedicated fab in Santa
Barbara) to manufacture qubits with higher uniformity and yield (Meet Willow, our state-of-the-ar
t quantum chip). Their roadmap (announced by Hartmut Neven and team) is to have a prototype
1,000 logical-qubit quantum computer by the end of the decade – which might correspond to ~1
million physical qubits with error correction. Google has been exploring alternative qubit modaliti
es as well (they’ve published on CMOS quantum dots and partnered on photonic qubit projects),
but their main focus remains superconducting circuits. On applications, Google has active resear
ch in quantum chemistry (they simulated the energy of a chemical compound using Sycamore wi
th an algorithm plus error mitigation) and optimization/AI (investigating how quantum sampling mi
ght help machine learning). A distinguishing feature of Google’s approach is its penchant for bold
engineering goals (they coined “quantum supremacy” for a specific milestone and achieved it, n
ow they talk of “building a useful error-corrected quantum computer” as the next big goal). Given
Google’s computational might, they also contribute to improving classical simulation of quantum
circuits – often trying to challenge their own devices to ensure claims of quantum advantage are
on solid ground (Meet Willow, our state-of-the-art quantum chip). In essence, Google remains o
ne of the top players, expected to be among the first to reach true fault-tolerant computing if thei
r current trajectory continues.
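To make the error-suppression claim concrete, the scaling usually quoted for the surface code (the code family behind Google’s experiments) can be written as a schematic model; the prefactor A and threshold p_th below are illustrative placeholders, not measured Willow values:

```latex
% Schematic surface-code suppression model:
%   p_L  : logical error rate per QEC cycle
%   p    : physical error rate,   p_th : error threshold
%   d    : code distance (a distance-d patch uses ~2d^2 physical qubits)
p_L \;\approx\; A \left(\frac{p}{p_{\mathrm{th}}}\right)^{\lfloor (d+1)/2 \rfloor}
```

Each increase of the distance d by 2 multiplies p_L by a factor of roughly p/p_th < 1, while the physical-qubit cost grows only quadratically in d – which is why operating below threshold, as Willow demonstrated, turns added qubits into exponentially fewer logical errors.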
Rigetti Computing: Rigetti is a Silicon Valley startup (founded 2013) that built one of the first priv
ate quantum labs. They also use superconducting qubits and were early in offering cloud access
to their quantum processors via their Forest platform. Rigetti’s distinguishing effort was on multic
hip quantum processors – instead of one large die, they developed a modular approach where c
hips are connected via tunable couplers. In 2021, Rigetti unveiled the world’s first commercial mul
tichip quantum processor, the 80-qubit Aspen-M, which is essentially two 40-qubit chips connec
ted together (Rigetti Computing Introduces Scalable Multi-chip Quantum Processor) (Introducin
g the Ankaa™-1 System — Rigetti's Most Sophisticated ...). This modular approach is aimed at sc
aling qubit count while keeping fabrication yields high (smaller chips are easier to produce). Riget
ti has also pursued novel two-qubit gate schemes and has worked on improving qubit coherence
using better materials (they operate their own dedicated fabrication facility, Fab-1, giving them in-house control over chip manufacturing). In terms of performance, Rigetti’s current 40-qubit devices have lower fidelity
than IBM/Google’s (two-qubit gate errors on the order of 1-2%, coherence times ~20 µs reporte
d), so they have faced challenges in achieving quantum advantage. The company underwent so
me restructuring in 2022–2023, but continues R&D. They recently announced a new architecture
called Ankaa with 84 qubits on a single chip and improved gate fidelities, and a plan for a 336-qu
bit system using four chips linked (each Ankaa chip 84 qubits) (Introducing the Ankaa™-1 System
— Rigetti's Most Sophisticated ...). Rigetti’s long-term vision is a scalable quantum cloud service
where algorithms can tap into increasing qubit counts seamlessly through their API. While they h
ave not hit the high notes of IBM or Google, Rigetti has contributed significantly, especially in the
idea of modular quantum computing, which even the bigger players are now adopting. Rigetti’s ef
forts also underscore the difficulty for startups in this space – requiring heavy investment in hard
ware with competition from tech giants. Nonetheless, their innovation in chip connectivity and th
eir presence in the ecosystem (they work with DARPA and DOE on some projects) make them an
important player to watch.
IonQ: IonQ (founded in 2015 as a spin-off from University of Maryland and Duke) is a leader in tra
pped-ion quantum computers. IonQ’s systems have fewer qubits than superconducting ones, but
each qubit is of very high quality. IonQ has reported among the highest quantum gate fidelities in
the industry (99.9% two-qubit gate fidelity in some instances). Instead of counting raw qubits, Io
nQ often uses “algorithmic qubits” (#AQ) as a metric, reflecting how many qubits can be effectiv
ely used in a computation before error rates make the result unreliable. In 2022, IonQ’s Aria syste
m reached 23 algorithmic qubits, which is currently a record in that metric (IonQ Announces Sec
ond Quarter 2022 Financial Results) (IonQ Aria is available now, exclusively on Azure Quantum).
Physically, Aria has about 32 ion qubits. IonQ has a roadmap projecting #AQ 29 in 2023, #AQ 35
in 2024, and #AQ 64 by 2025, implying they hope to double the useful computational power ever
y year or so. IonQ’s architecture allows all-to-all connectivity, which means even with moderate q
ubit counts, they can implement circuits that might be very deep or complex on a less connected
architecture. They have demonstrated small algorithms in quantum chemistry and machine learni
ng. IonQ also went public (via SPAC) in 2021, garnering significant capital to accelerate developm
ent. They are working on new generations of ion trap designs, including Evans (an 87-qubit syste
m in development) and exploring photonic coupling of ion traps to scale beyond one chain of ion
s. IonQ and Quantinuum are in a friendly rivalry as the top trapped-ion companies. IonQ’s approa
ch has been very application and software-driven as well – their teams published results on varia
tional algorithms, and they partner with software startups to make use of their machines. The str
ength of IonQ is clearly the high fidelity and long coherence of their qubits, which makes them st
rong candidates for near-term quantum advantage in specialized tasks. The weakness is slower
gates and the difficulty of handling too many ions in one trap (they currently use a single linear c
hain; moving beyond ~50 ions might require new architectures). IonQ’s vision for 2028+ is to hav
e modular quantum computers that could potentially reach hundreds of qubits by linking multiple
chains with photonic interconnects. In the nearer term, IonQ is already providing access to their s
ystems via all major cloud providers (AWS, Azure, Google Cloud) – making them quite influential i
n giving researchers hands-on experience with a different qubit modality.
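The practical payoff of all-to-all connectivity is easy to see with a small transpilation experiment. The sketch below is illustrative only – it uses the open-source Qiskit transpiler with generic coupling maps, not IonQ’s actual compiler – but it shows how a sparsely connected layout forces SWAP insertion that a fully connected ion chain avoids:

```python
# Illustrative sketch (not IonQ's toolchain): route the same densely
# connected circuit onto a fully connected graph vs. a linear chain.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

n = 8
qc = QuantumCircuit(n)
for i in range(n):
    for j in range(i + 1, n):
        qc.cx(i, j)  # CNOT between every pair: a dense interaction pattern

# All-to-all (trapped-ion-like): no routing, so no SWAP gates inserted.
full = transpile(qc, coupling_map=CouplingMap.from_full(n), optimization_level=1)
# Linear chain (restricted layout): router must insert SWAPs to move qubits.
line = transpile(qc, coupling_map=CouplingMap.from_line(n), optimization_level=1)

print("all-to-all depth: ", full.depth())
print("linear-chain depth:", line.depth())  # typically several times deeper
```

The deeper the routed circuit, the more error it accumulates, so full connectivity lets a 25–30-qubit ion machine run programs that a sparsely coupled chip of similar fidelity could not.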
Quantinuum (Honeywell): Quantinuum merits mention here as a major player, born from the merger of Honeywell Quantum Solutions and Cambridge Quantum. Quantinuum’s H1 ion-trap devices have set records for low error rates (single-qubit error
<0.1%, two-qubit ~0.2%) and have achieved notable experiments like a fully fault-tolerant quant
um algorithm execution (Quantinuum’s H1 quantum computer successfully executes a fully fault
-tolerant algorithm with three logically-encoded qubits). In 2023, Quantinuum launched the H2 s
econd-generation trap, with 32 qubits and all-to-all connectivity using an advanced “racetrack” t
rap design (Quantinuum Launches H2, Reports Breakthrough in Work on ...). They also reported
creating non-Abelian topological matter and anyons on H2 – a physics breakthrough showing th
e versatility of their machine (For the First Time Ever, Quantinuum's New H2 Quantum Computer
...). Quantinuum has been a pioneer in quantum cybersecurity too (they released a quantum rand
om number generator product and are involved in PQC via their Cambridge roots). With strong ba
cking from Honeywell and a large team, Quantinuum plays a leading role especially in demonstrat
ing how quantum error correction can be done in trapped ions (their experiments with color code
s and real-time correction are among the best to date (Quantinuum’s H1 quantum computer succ
essfully executes a fully fault-tolerant algorithm with three logically-encoded qubits)). We can e
xpect Quantinuum to continue pushing towards larger, interconnected ion traps and exploring diff
erent coding techniques (they tout the flexibility of their QCCD (Quantum CCD) architecture in ru
nning various codes, not just surface codes (Quantinuum’s H1 quantum computer successfully e
xecutes a fully fault-tolerant algorithm with three logically-encoded qubits)).
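For orientation, the smallest color code – the [[7,1,3]] Steane code, of the family used in such fault-tolerance demonstrations – encodes one logical qubit in seven physical qubits and is defined by six stabilizer generators (a textbook construction, shown here for context rather than as Quantinuum’s exact encoding):

```latex
% [[7,1,3]] Steane color code: 7 physical qubits, 1 logical qubit, distance 3.
% X-type stabilizer generators (Z-type are identical with X -> Z):
S^X_1 = I\,I\,I\,X\,X\,X\,X \qquad
S^X_2 = I\,X\,X\,I\,I\,X\,X \qquad
S^X_3 = X\,I\,X\,I\,X\,I\,X
```

Measuring these stabilizers each cycle flags any single-qubit fault without disturbing the encoded data, which is what enables the real-time correction Quantinuum reported.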
Microsoft: Microsoft’s approach to quantum computing has been unique. For years they pursued
the topological Majorana qubit (as discussed earlier) which took longer than hoped to come to fr
uition. In the meantime, Microsoft built out Azure Quantum, a cloud platform that offers access to
other companies’ quantum hardware (IonQ, Quantinuum, and Rigetti all have or had their machine
s available through Azure). Microsoft also developed a full software stack, including the Q# progr
amming language and tools for quantum development. Now, with the announcement of Majorana
1 in 2025, Microsoft is reasserting itself on the hardware front with what could be a game-change
r if it works as advertised (Microsoft unveils Majorana 1, the world’s first quantum processor pow
ered by topological qubits - Microsoft Azure Quantum Blog). They claim to be on track to a fault-
tolerant prototype within a few years using topological qubits (Microsoft unveils Majorana 1, the
world’s first quantum processor powered by topological qubits - Microsoft Azure Quantum Blo
g). Microsoft’s contributions thus far have been more on the theoretical and software side (quant
um algorithms research, error-correcting code theory, etc.), but if their topological qubit succeed
s, they may leapfrog in hardware. They are also part of major research collaborations, e.g. with u
niversity labs (University of Sydney for cryogenic controls, Delft University for materials). In 2024, Microsoft researchers also demonstrated a 24-qubit neutral atom array (through a partnership
with startup Atom Computing), achieving some record coherence times (Quantum Computing A
dvances in 2024 Put Security In Spotlight). This was announced on Azure as well (Quantum Co
mputing Advances in 2024 Put Security In Spotlight). It indicates Microsoft is hedging by also ex
ploring other qubit tech like neutral atoms (Atom Computing’s 24-qubit system, while small, had
>40s coherence on certain states (Quantum Computing Advances in 2024 Put Security In Spotli
ght)). Overall, Microsoft is gearing up on many fronts: if they can build on the Majorana breakthro
ugh, their vision of a million-qubit chip could radically accelerate the field; if not, they still remain
deeply involved via Azure in supporting the broader ecosystem.
D-Wave Systems: D-Wave, based in Canada, took a different path with quantum annealing mach
ines. These are not general gate-model computers but rather specialized analog devices for solvi
ng optimization problems by exploiting quantum fluctuations. D-Wave was the first company to s
ell quantum computers – they’ve delivered systems to Lockheed Martin, Google/NASA (in the ear
ly 2010s), and more recently to national labs like Los Alamos and Forschungszentrum Jülich. The
latest machine, Advantage, has more than 5,000 qubits and 15-way connectivity between them
(D-Wave Delivers 5000-qubit System; Targets Quantum Advantage). These qubits are superconducting flux qubits, and the 15-way connectivity is much higher than in earlier D-Wave models, which offered only 6 or 8 connections per qubit (D-Wave Delivers 5000-qubit System; Targets Quantum Advantage).
Although 5,000 qubits sounds impressive, the computational model is different – those qubits rep
resent variables in an Ising-model problem and they operate collectively to find low-energy soluti
ons (a kind of quantum parallel search). In practice, D-Wave’s systems have been used to experi
ment with portfolio optimization, scheduling, protein folding approximations, and machine learnin
g tasks like sampling Boltzmann distributions. There is debate on whether D-Wave’s quantum ann
ealing provides speedups over classical algorithms; some problems see gains, others do not, and
often one must tune the problem carefully. Regardless, D-Wave has the distinction of offering the
largest quantum processors by qubit count and has developed a lot of expertise in cryogenic en
gineering and fabrication. Interestingly, D-Wave recently announced plans for a gate-model quan
tum computer of their own (called the Conductor program), aiming to leverage their flux qubit tec
h in a gate-based context, possibly around 2026. D-Wave’s role in the industry is significant for c
ommercializing quantum tech early – they essentially created a niche for quantum annealing. As
quantum gate-model machines grow, annealers may remain complementary for certain combinato
rial optimization tasks or serve as a testbed for quantum-classical hybrid algorithms (many users
run a hybrid solver mixing D-Wave quantum runs with classical heuristics). For instance, Volkswa
gen has used D-Wave for traffic flow optimization, and companies like DENSO for factory schedu
ling (D-Wave unveils its most powerful quantum computer to date - Fortune). While annealers ca
nnot run Shor’s or arbitrary algorithms, they remain a part of the quantum computing landscape,
especially for benchmarking optimization problems.
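For readers unfamiliar with the annealing model, here is a minimal sketch of the Ising formulation those 5,000 qubits target, written with D-Wave’s open-source dimod library; the biases and couplings are arbitrary illustrations, and the brute-force ExactSolver stands in for actual annealing hardware:

```python
# Minimal Ising-model sketch using dimod (D-Wave's open-source library).
# Energy: E(s) = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j,  s_i in {-1,+1}.
import dimod

h = {"a": -1.0, "b": 0.5, "c": 0.0}       # linear biases (illustrative)
J = {("a", "b"): 1.0, ("b", "c"): -1.0}   # pairwise couplings (illustrative)
bqm = dimod.BinaryQuadraticModel(h, J, 0.0, dimod.SPIN)

# On real hardware this would go to a cloud sampler; here we enumerate all
# 2^3 spin assignments and keep the lowest-energy one.
best = dimod.ExactSolver().sample(bqm).first
print(best.sample, best.energy)
```

An annealer physically relaxes toward such low-energy configurations; the 5,000+ qubits therefore count problem variables, not gate-model qubits, which is why the number is not directly comparable to IBM’s or Google’s.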
Academic and Government Research: Beyond these companies, governments and universities w
orldwide drive quantum computing R&D through large initiatives:
The U.S. National Quantum Initiative (NQI) has established research centers (e.g. SQMS at F
ermilab for superconducting quantum materials, Q-NEXT for enabling technologies, etc.) and
increased funding to labs like NIST, NASA, and the Department of Energy laboratories. These
centers often collaborate with industry (IBM, Google, etc.) and academia to push specific go
als like improved qubit coherence (Fermilab’s SQMS, for instance, works with Rigetti and oth
ers and achieved a 2 ms coherence time in a superconducting cavity – useful for bosonic qu
bits).
In the EU, the Quantum Flagship (a 10-year €1 billion program started in 2018) supports a variety of hardware projects: from superconducting (Google’s Munich lab, IQM in Finland) to trapped ions (University of Innsbruck, University of Sussex) to photonics (France’s Quandela, UK’s ORCA Computing) and even spin qubits (Europe is strong in semiconductor spins, with groups at Delft, CEA France, etc.). Academic consortia in the EU have built 50-qubit superconducting testbeds and are exploring unique approaches like distributed quantum computing, where small modules are entangled via photonic links.
China has invested heavily via its Chinese Academy of Sciences and universities like USTC.
They have demonstrated competitive results (quantum supremacy experiments in both photo
nics and superconducting circuits) and reportedly are working on their own 100+ qubit super
conducting processors. China also leads in some aspects of quantum communication and n
etworks (e.g. the longest quantum key distribution network), which ties into future quantum c
omputing networking. Government backing in China has been very strong, aiming at reducin
g reliance on Western tech and achieving leadership in strategically important quantum tech.
Academia continues to produce fundamental breakthroughs: For example, academic researc
hers pioneered many of the QEC codes now implemented (surface code came from academi
a before industry picked it up; bosonic codes were largely developed by academics at Yale,
etc.). Universities also train the quantum workforce that companies hire. Notable academic m
ilestones in the past decade include the first transmon qubit with >0.1 ms coherence (Yale), t
he first logic between distant ion traps (Maryland), and the first 2D array of neutral atoms wit
h high-fidelity gates (Harvard/MIT). As another example, a team at Delft demonstrated a smal
l network of 3 quantum processors entangled in 2022 – hinting at how modular quantum net
works might form a quantum internet in the future. Such academic experiments, often in coll
aboration with companies, push the envelope in ways a single corporate roadmap might not.
DARPA and other agencies have been funding high-risk, high-reward projects like Microsof
t’s topological qubit (via the US2QC program) (Microsoft unveils Majorana 1, the world’s first
quantum processor powered by topological qubits - Microsoft Azure Quantum Blog), as well
as exotic ideas like optically controlled spin qubits, or thorough benchmarking to quantify pro
gress.
All these efforts feed into a healthy quantum technology ecosystem. It’s often said we are now in the
“transition from the NISQ era to the fault-tolerant era.” NISQ (Noisy Intermediate-Scale Quantum) ref
ers to the current generation of devices that are not error-corrected and have tens to hundreds of qu
bits with imperfect operations (IBM Quantum Computers: Evolution, Performance, and Future Directi
ons). We are trying to extract practical value from NISQ devices via clever algorithms and error mitiga
tion, while simultaneously building the foundation (QEC, hardware scaling) for the future fault-tolerant
devices. Industry leaders all have their eyes on demonstrating a clear quantum advantage for a usef
ul task as the next milestone. This could happen in different domains:
Cryptography/Cryptanalysis: On the flip side of the security threat, quantum computers might b
e used by governments for codebreaking (as we discussed with RSA). It’s a major anticipated ap
plication once enough scale is reached.
Chemistry and Materials: Simulating quantum systems (molecules, materials) is a natural applica
tion of quantum computers. Even a modest fault-tolerant quantum computer with ~100 logical qu
bits could potentially simulate chemical reactions that are beyond today’s supercomputers. Comp
anies like IBM, Google, and startups like QC Ware, Zapata, etc., are working with chemical and ph
arma firms to identify likely first targets (for instance, simulating the reaction mechanism of a cat
alyst, or the electronic structure of a complex molecule like FeMoco relevant to fertilizer producti
on). This application might become practical in the next decade as hardware and algorithms impr
ove.
Optimization: Many industries have hard optimization problems (scheduling, routing, allocation).
Quantum annealers like D-Wave target these, but also gate-model algorithms (QAOA, Grover-bas
ed search, variational algorithms) are being explored on NISQ devices. It’s still an open question
how much quantum computers will outperform classical heuristics for optimization, but research
is ongoing. Companies like Volkswagen, BMW, and FedEx have run pilots with quantum solvers
(both annealers and gate-model via cloud).
Machine Learning: Quantum machine learning is a hot topic, though it’s speculative how it will pa
n out. Ideas include quantum data encoding providing better representations, or quantum kernels
for ML algorithms. Startups (TensorQ, etc.) and big companies are investigating if quantum can s
peed up certain linear algebra subroutines or sampling tasks in ML. We have not yet seen a quant
um ML advantage, but it’s an area of interest especially as quantum hardware grows.
Finance: Financial institutions are testing quantum algorithms for portfolio optimization, option pri
cing (using quantum amplitude estimation, a quadratic speedup over Monte Carlo). For example,
JPMorgan and IBM collaborated to run small quantum experiments on option pricing with a Grove
r-like approach. The speedups will require fault-tolerant machines to really beat classical method
s, but the financial sector is heavily investing to be ready for when that happens, since even a mo
derate speedup in Monte Carlo could save huge compute costs (see the back-of-envelope sketch after this list).
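As flagged in the finance item above, the quadratic speedup of quantum amplitude estimation over classical Monte Carlo is easy to quantify. A back-of-envelope sketch (the target precision is an arbitrary illustration and constant factors are ignored):

```python
# Back-of-envelope comparison: classical Monte Carlo error shrinks as
# ~1/sqrt(N) with N samples, while quantum amplitude estimation (QAE)
# shrinks as ~1/M with M oracle queries -- a quadratic speedup.
import math

eps = 1e-4                             # desired precision (illustrative)
mc_samples = math.ceil(1 / eps ** 2)   # N ~ 1/eps^2
qae_queries = math.ceil(1 / eps)       # M ~ 1/eps

print(f"classical Monte Carlo samples: ~{mc_samples:,}")   # ~100,000,000
print(f"QAE oracle queries:            ~{qae_queries:,}")  # ~10,000
```

The caveat, as noted above, is that each QAE query is a deep coherent circuit, so realizing the gain in practice awaits fault-tolerant hardware.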
Looking at future projections, one common thread is that we expect a period where error-corrected l
ogical qubits are scarce. Perhaps by ~2030 we have a few tens of logical qubits operational. In that r
egime, algorithms have to be optimized to use minimal qubits (there’s talk of algorithmic breakthroug
hs to reduce qubit requirements for chemistry and so on). By mid-2030s, if progress continues, we m
ight have hundreds of logical qubits, opening the door to more ambitious algorithms like Shor’s factori
ng of 2048-bit RSA or precise chemical simulations for drug discovery. Some government forecasts, l
ike a 2021 U.S. National Academies report, projected that breaking RSA-2048 may be 20 years out (w
hich aligns with the expert consensus we cited, around late 2030s) and that significant applications
might emerge once machines exceed ~1,000 logical qubits.
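A rough sizing exercise shows why “~1,000 logical qubits” keeps appearing alongside million-physical-qubit roadmaps. The sketch below reuses the schematic surface-code suppression model from the Google section, with assumed (illustrative) error rates and the common ~2d² physical-qubits-per-logical estimate:

```python
# Rough surface-code sizing (all numbers are illustrative assumptions):
#   p_L ~ A * (p / p_th) ** ((d + 1) / 2), with ~2*d^2 physical qubits
#   per distance-d logical qubit.
import math

p, p_th, A = 1e-3, 1e-2, 0.1   # physical error rate, assumed threshold, prefactor
target_pL = 1e-12              # per-logical-qubit error budget for long algorithms

# Smallest odd distance d meeting the budget (epsilon guards float round-off).
halves = math.log(target_pL / A) / math.log(p / p_th)   # required (d + 1) / 2
d = math.ceil(2 * halves - 1 - 1e-9)
d += (d + 1) % 2               # round up to an odd code distance

phys_per_logical = 2 * d * d
print(f"code distance d = {d}")                                   # -> 21
print(f"~{phys_per_logical} physical qubits per logical qubit")   # -> ~882
print(f"1,000 logical qubits -> ~{1000 * phys_per_logical:,} physical qubits")
```

Under these assumptions a distance of 21 suffices, i.e. roughly 900 physical qubits per logical one – putting 1,000 logical qubits within sight of the ~1 million physical qubits quoted in the roadmaps above.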
In terms of real-world impact, the first major impact might actually be forcing an overhaul of cyberse
curity (via PQC) before quantum computers ever crack a code – a somewhat ironic outcome where th
e threat causes change even while the technology is still incubating. Beyond that, once quantum com
puters solve a valuable problem that classical computers definitively cannot, we’ll see a surge in dem
and and likely a quantum computing “race” not unlike the space race. Some experts suggest quantu
m computing could become a utility service in the cloud, where businesses access quantum processi
ng for specific tasks (just as they do for GPUs now). IBM uses the term “Quantum Utility” to refer to t
he point at which quantum computing delivers practical value that justifies its use in industry (Quantu
m Computing Advances in 2024 Put Security In Spotlight).
It’s important to temper excitement with realism: even optimistic projections say we have ~5-10 years
before fault-tolerant quantum computers are running algorithms that outperform classical ones in pra
ctical terms (NIST Releases 3 Post-Quantum Standards, Urges Orgs to Start PQC Journey). The curr
ent state is that no practical problem has seen a quantum speedup yet (Qubit Contenders - Heidelbe
rg Laureate Foundation ). But the trend lines are very encouraging – every year, qubits get better and
more numerous, and quantum software improves. The collaborations among industry, academia, and
government mean that breakthroughs (like error-corrected gates, new qubit types, etc.) are quickly i
mplemented across the field.
In conclusion, the current state of quantum computing is one of rapid advancement and convergenc
e of efforts from many sectors. Technologically, we have prototype quantum processors that are begi
nning to scale into the 100s of qubits, with early error correction demonstrating viability (Google clai
ms milestone in quantum error correction • The Register). Theoretically, we have a much clearer understanding of what it will take to
achieve fault tolerance (thousands of physical qubits per logical, high thresholds, good decoders, et
c.) and no show-stoppers have emerged – it’s a matter of hard engineering. In parallel, we’re reinventi
ng our cryptographic infrastructure to be safe for the quantum age (NIST Announces First Four Quan
tum-Resistant Cryptographic Algorithms | NIST) (NIST Releases 3 Post-Quantum Standards, Urges
Orgs to Start PQC Journey). Industry leaders are pouring resources into this field, creating a virtuous
cycle of competition and innovation. Over the next decade, we can expect quantum computers to tra
nsition from laboratory curiosities to valued tools for specific domains, and then eventually to broad-i
mpact machines that can tackle problems once thought impossible. The race is on, and as this analys
is shows, the groundwork for the quantum future is actively being laid today. Every qubit added, ever
y error corrected, every new algorithm invented is bringing that future closer.
References: This report has cited the latest research and expert commentary to substantiate the poin
ts made. Key sources include academic papers (e.g. Gidney & Ekera’s factoring cost estimates (How t
o factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits – Quantum)), industry news (such as Goo
gle’s Quantum AI blog on the Willow chip (Meet Willow, our state-of-the-art quantum chip), Microsof
t’s Azure announcement of Majorana 1 (Microsoft unveils Majorana 1, the world’s first quantum proce
ssor powered by topological qubits - Microsoft Azure Quantum Blog)), cybersecurity analyses (Dark
Reading’s coverage of quantum threat timelines (Quantum Computing Advances in 2024 Put Securit
y In Spotlight) and post-quantum migration (NIST Releases 3 Post-Quantum Standards, Urges Orgs t
o Start PQC Journey)), and overviews from scientific outlets (e.g. HLF’s summary of qubit technologi
es (Qubit Contenders - Heidelberg Laureate Foundation)). These references, embedded throughout the text, provide further detail and credibility for the information presented. Citations appear inline in parentheses, naming the specific document or article each point draws on; readers are encouraged to consult these sources for deeper exploration of any topic discussed.