Hope, Hype and Realism on the Quantum Computing Frontier
By Global Trends Editor Group
Finally, almost 80 years after Edward Teller sketched out the details of nuclear fusion technology, scientists at Lawrence Livermore National Laboratory recently announced a successful controlled nuclear fusion experiment that barely yielded more energy than it consumed.
This was an important milestone on a path that has taken many decades and cost many billions of dollars. But even now, it is likely that we are still two or three decades away from commercially viable electrical generation using controlled nuclear fusion.
And importantly, by the time this technology is ready for full-scale deployment, the economics of nuclear fission will make fusion power a tough sell in a competitive marketplace for run-of-the-mill electricity.
This cautionary tale is relevant to how we think about today's most daring and cutting-edge technology: quantum computing.
Back in 2013, our book Ride the Wave treated quantum computing not as a marginal, high-risk technology like controlled fusion, but as a medium-risk reality, much like artificial intelligence or synthetic biology.
Ten years and billions of dollars later, the promise and problems of quantum computing have become more obvious. For some, the intervening years have only made the allure of quantum computing more intoxicating. But for many others, this experience has been quite sobering.
Let's consider where we stand and what this portends for investors, users, and policymakers.
In the past year, quantum computing seems to have reached "peak hype," particularly when it comes to claims about how it will be commercialized. To hear end-users and investors talk, you'd think big payoffs are just around the corner.
And admittedly, useful applications for quantum computers do exist. Perhaps the best-known is Peter Shor's 1994 theoretical demonstration that a quantum computer could find the prime factors of large numbers exponentially faster than any known classical approach.
Prime factorization is at the heart of breaking the widely used RSA cryptography scheme. That's why Shor's factoring algorithm immediately attracted the attention of national governments everywhere, leading to considerable quantum-computing research funding.
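To see why factoring matters for RSA, consider a toy sketch: an attacker who can factor the public modulus can derive the private key. All names and numbers below are illustrative assumptions (real RSA moduli are roughly 2048 bits, far beyond brute force); this is not Shor's algorithm, just a demonstration of what factoring buys an attacker.

```python
# Toy illustration of why factoring breaks RSA.
# Numbers are tiny for demonstration only.

def trial_factor(n):
    """Classical brute-force factoring; exponential in the bit-length of n."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1

# A toy RSA key: public modulus n = p*q, public exponent e.
p, q = 61, 53
n, e = p * q, 17

# An attacker who can factor n recovers the private exponent d.
fp, fq = trial_factor(n)
phi = (fp - 1) * (fq - 1)
d = pow(e, -1, phi)            # modular inverse of e mod phi(n), Python 3.8+

msg = 42
cipher = pow(msg, e, n)        # encrypt with the public key
recovered = pow(cipher, d, n)  # decrypt with the *derived* private key
assert recovered == msg
```

For a 2048-bit modulus, `trial_factor` would take longer than the age of the universe; Shor's quantum algorithm would, in principle, make the same step tractable.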
But there was just one big problem: actually building a quantum computer that could do this. Doing so requires implementing an idea pioneered by Shor and others called quantum error correction, a process that compensates for the fact that quantum states quickly dissipate because of environmental noise.
That phenomenon is called "decoherence." Back in 1994, scientists thought that such error correction would be easy because physics allows it. But in practice, it is extremely difficult to achieve.
The most advanced quantum computers today have dozens of decohering (or "noisy") physical qubits. Building a quantum computer that could crack RSA codes out of such components would require millions, or perhaps billions, of qubits.
Only tens of thousands of so-called logical qubits would be used for computation; the rest would be needed for error correction, compensating for decoherence.
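The scale mismatch can be made concrete with back-of-the-envelope arithmetic. The overhead figure below is an illustrative assumption (published estimates vary enormously with hardware error rates and the error-correcting code chosen), not an engineering projection:

```python
# Rough qubit budget for error-corrected factoring, using round numbers.
logical_qubits = 20_000   # "tens of thousands" of logical qubits (per the text)
overhead = 1_000          # assumed physical qubits per logical qubit (illustrative)

physical_qubits = logical_qubits * overhead
print(f"{physical_qubits:,} physical qubits needed")  # millions, as the text says

noisy_qubits_today = 100  # today's machines: dozens to low hundreds of noisy qubits
print(f"roughly {physical_qubits // noisy_qubits_today:,}x today's scale")
```

Even with these deliberately conservative assumptions, the gap between laboratory devices and a code-breaking machine spans several orders of magnitude.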
What does this mean? According to quantum researcher Sankar Das Sarma, the qubit systems in today's laboratories bring us no closer to having a quantum computer that can solve problems anybody cares about than "trying to make today's best smartphones using vacuum tubes from the early 1900s.
You can put 100 tubes together and establish the principle that if you could somehow get 10 billion of them to work together in a coherent, seamless manner, you could achieve all kinds of miracles. What, however, is missing is the breakthrough of integrated circuits and CPUs leading to smartphones.
It took 60 years of very difficult engineering to go from the invention of transistors to the smartphone with no new physics involved in the process." In the case of commercial quantum computing, we're talking about inventing lots of new physics.
For instance, this might involve bypassing quantum error correction altogether by using far more stable qubits, in an approach called topological quantum computing, which Microsoft is pursuing. But it turns out that developing topological quantum-computing hardware is also a huge challenge.
So it is unclear whether extensive quantum error correction, topological quantum computing, a hybrid of the two, or something else as yet unimagined will be the eventual winner.
While this competition sorts itself out, the great difficulty of eliminating decoherence is leading some experts to pursue a technology called NISQ, for "noisy intermediate-scale quantum" computing. This idea, discussed previously in Trends, uses small collections of noisy physical qubits to do something useful, and to do it better than a classical computer can.
However, Das Sarma, like many others, isn't sure this concept will lead to real machines that can solve real problems better than classical computers. He wants to know: How noisy are these systems? How many qubits are involved? And what worthy problems can the proposed NISQ machines actually solve? At this point, he indicates, the answers are "far from clear."
A look at some recently published research makes clear that the gulf between real solutions and today's experiments is enormous. In a recent laboratory experiment, Google used 20 noisy superconducting qubits to observe some predicted aspects of a phase of quantum dynamics dubbed "time crystals."
The experiment was an impressive showcase of electronic control techniques, but it showed no computing advantage over conventional computers, which can readily simulate time crystals with a similar number of virtual qubits.
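The reason 20 qubits confer no advantage is simple to quantify: a classical computer can simulate n qubits exactly by storing all 2^n complex amplitudes of the state vector, and at n = 20 that is a trivial amount of memory. A quick sketch (16 bytes per amplitude assumes double-precision complex numbers):

```python
# Memory needed to hold a full n-qubit statevector classically:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

print(statevector_bytes(20) // 2**20, "MiB")  # 16 MiB: trivial on a laptop
print(statevector_bytes(50) // 2**50, "PiB")  # 16 PiB: beyond any machine
```

The exponential blow-up only begins to defeat classical simulation somewhere around 50 noisy qubits, well beyond the time-crystal experiment's scale.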
It also did not reveal anything about the fundamental physics of time crystals. Meanwhile, other documented NISQ experiments addressed highly specialized tasks of no commercial value whatsoever.
So, despite a constant drumbeat of NISQ-hype coming from various quantum computing startups, the commercialization potential is far from clear. There are vague claims about how NISQ could be used for fast optimization or even for AI training.
However, no expert in optimization or AI can tell us how this might actually work. Some seem to think that since we do not quite understand how classical machine learning and AI really work, it is possible that NISQ could somehow do the job even faster. Maybe that's true, but this sounds more like "hopium" than real science or technology.
Notably, this NISQ-based "hopium" addiction is not confined to those pursuing optimization or cryptography breakthroughs. There are also proposals to use small-scale quantum computers for drug design, as a way to quickly calculate molecular structures. While this is possible, it seems unlikely to prove revolutionary, since quantum chemistry represents a minuscule portion of the whole drug-design process.
Experts in quantum mechanics are also perplexed by claims that near-term quantum computers will help in finance.
No technical papers convincingly demonstrate that small quantum computers, let alone NISQ machines, can lead to significant optimization in
- algorithmic trading,
- risk evaluation,
- arbitrage,
- hedging,
- asset trading,
- risk profiling or
- targeting and prediction.
However, this lack of business-related evidence has not prevented several investment banks from jumping on the quantum-computing bandwagon. It reminds the Trends editors of the green obsession with wind and solar energy, where all "profits" have only come from a combination of tax credits and Chinese dumping.
Obviously, a real quantum computer will have applications unimaginable today, but they aren't going to materialize without a lot of breakthroughs. Think of this situation as much like the one that existed when the first transistor was made in 1947; nobody could foresee how to turn it into a smartphone or a laptop.
Therefore, it makes sense for investors and researchers to temper their enthusiasm and hope. Like controlled nuclear fusion, quantum computing is a potentially disruptive technology, but how anyone can claim that it will soon start producing billions of dollars of profit for real companies selling services or products is perplexing.
Those who invest and manage as if that is going to happen need to explain how it will happen. Quantum computing is indeed one of the most important developments not only in physics but in all of science. However, "entanglement" and "superposition" are not magic wands that someone can wave to transform technology in the near future.
Furthermore, people need to recognize that while quantum mechanics is weird and counterintuitive, weirdness by itself does not guarantee revenue and profits.
When will this technology arrive and enhance profitability? Nobody knows. Predicting the future of technology is very difficult. Smart prognosticators agree: "it happens when it happens."
Analogies may help make this point. The first hydrogen bomb test occurred 70 years ago in 1952. At the time, electricity from controlled fusion was expected to be 20 years away. Today, it may still be 20 years away. Or it may take longer.
Controlled fusion, general artificial intelligence and quantum computing will all transform industries when they arrive. But those who build their world around expecting them to arrive on schedule are likely to be very disappointed.
Given this trend, we offer the following forecasts for your consideration.
First, before 2032, the world will see few, if any, benefits from quantum computing technology.
Quantum key distribution for military applications and financial networks may be deployed in that time frame, but other commercial quantum computing applications are unlikely within the next two decades.
Second, quantum computing will go through a trough of disillusionment and despair around 2027.
History has shown that the adoption of almost every technology-based innovation follows the classic "hype-cycle" model developed by the Gartner Group. After emerging on analysts' radar roughly 15 years ago, the excitement surrounding quantum computing has risen to a crescendo, even though tangible results are few and far between.
Because of this disconnect, it¡¯s inevitable that venture capitalists, pundits and many in the general public will lose faith. Start-ups and corporate research efforts will be scaled back or shuttered. However, the long slog toward commercialization will continue and eventually bear fruit. And,
Third, widespread commercial applications outside cryptography and communications will take decades to emerge.
Not only is the hardware proving exceedingly hard to build and operate, but useful algorithms are proving especially hard to formulate. In contrast, early mainframes succeeded because many batch transaction jobs were high value.
For quantum computers, the key will be finding applications where quantum dominance creates huge value and can be achieved with "first-generation machines." At present, classical computers can solve most problems that really make a difference, and the problems that only quantum computers can solve will require machines that can't be built in the foreseeable future.
Resource List
1. MIT Technology Review. March 8, 2022. Sankar Das Sarma. Quantum computing has a hype problem.
2. TechRepublic.com. October 20, 2021. Veronica Combs. Quantum reality check: Gartner expects 10 more years of hype but CIOs should start finding use cases now.