Any day now, Google is expected to achieve quantum supremacy—the use of a quantum computer to solve a problem that even the most advanced supercomputer can't unravel. That milestone, which Google has said it will reach by year-end, will no doubt be greeted with headlines proclaiming the dawn of the quantum computing age. Prepare for lots of stories about how quantum computing will soon do everything from inventing wonderful new pharmaceuticals and almost-magical new materials (good) to rendering obsolete all existing public-key encryption (not so good).
There's plenty of momentum. Earlier this month, Intel Corp. researchers unveiled a superconducting chip for quantum computers. The news follows several other advances in quantum computing over the past two years from tech giants such as International Business Machines Corp. and Alphabet Inc.'s Google, from Canada's D-Wave Systems Inc. (the only company to sell a commercial quantum computer; it has sold four), and from startups like Rigetti. Google itself just released software to make it easier for chemists and materials scientists to use the quantum machines it and others have built.
Investors are getting excited. In June, Blue Yard Capital, a venture capital firm based in Berlin, held a conference on quantum computing in Munich that drew many other investors and large enterprises hoping to cash in on the new technology. Several other VCs I've spoken to in the past few weeks have also bragged about their plans to get in early on what seems like the bleeding edge of computer science.
But some fear that rather than a moment of triumph, Google's quantum supremacy announcement may backfire. They fret that the search giant's PR machine and the journalistic catnip of the phrase will lead to inflated expectations about what the technology can do, and result in inevitable disillusionment when those expectations are dashed. (To be fair, Google didn't coin the term quantum supremacy; John Preskill, a Caltech physicist, did. But there's no denying Google's marketing muscle is helping to popularize it.)
"If you had never heard that phrase before, you would think that we had entered a time when a quantum computer will do everything better than an ordinary computer, and that is far from the case," says Simon Benjamin, a professor of quantum technologies at the University of Oxford. "We cannot put a quantum computer on a table and have it do something useful that other computers can't do." In fact, today's quantum computers not only fall short of conventional supercomputers; for many tasks they actually perform worse than a standard laptop.
[Photo: Scientists study enriched silicon, used in quantum computing. Credit: Mike Thewalt]
Classical computers use tiny silicon transistors to process data in binary units called bits, each of which is either a 0 or a 1. Quantum computers, in contrast, process information in quantum bits, or qubits. A qubit (there are many ways to build one, but most designs involve superconducting materials cooled to temperatures below those found in deep space) takes advantage of quantum mechanics to do some mind-bending things. For instance, it can be both a 0 and a 1 at the same time, a property called superposition. And while each bit in a classical computer functions independently of the others, in a quantum computer a qubit's state can depend on the states of other qubits, a property called entanglement. These properties, in theory, give a quantum computer exponentially more power.
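To see where "exponentially more power" comes from, note that fully describing the state of n qubits on a classical machine takes 2 to the power n complex numbers, called amplitudes. A minimal sketch in Python (a toy illustration of the bookkeeping, not how real quantum hardware operates):

```python
import numpy as np

# A single qubit in an equal superposition of 0 and 1:
# one complex amplitude per possible outcome.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Squared amplitudes give measurement probabilities: 50/50 here.
probs = np.abs(qubit) ** 2

# Describing n qubits requires 2**n amplitudes, which is why the
# classical cost of tracking a quantum state explodes so quickly.
for n in (1, 10, 49):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

The doubling per added qubit is the crux: each new qubit doubles the size of the state a classical simulator must track.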
The problem is that the quantum computers we've built so far simply aren't very big. D-Wave's latest machine has 2,048 qubits, and the company says it is working on a 4,000-qubit model, but its annealing-based design is really only good for solving optimization problems. Google will demonstrate quantum supremacy using a machine that has either 49 or 50 qubits, according to spokeswoman Charina Choi. That's just big enough to do something a standard computer can't. (Or so everyone thought until last week, when IBM, perhaps seeking to move the goalposts on its rival, published a paper claiming it had managed to simulate a 49-qubit system on a conventional machine.) Even so, it isn't big enough, Benjamin says, to do anything most people would consider useful. (Google has strongly hinted that it will use a problem from quantum computing itself to demonstrate quantum supremacy.)
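A back-of-the-envelope calculation shows why roughly 50 qubits was considered the cutoff: a naive classical simulation stores one complex amplitude (16 bytes in double precision) per basis state, and the number of basis states doubles with every qubit. (IBM's paper relied on cleverer techniques than this naive tally, which is how it pushed past the apparent limit.)

```python
def naive_sim_memory_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a full statevector: 2**n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 49 qubits: 2**49 amplitudes at 16 bytes each is about 9 petabytes,
# far beyond the RAM of any single supercomputer.
petabytes = naive_sim_memory_bytes(49) / 1e15
print(f"{petabytes:.1f} PB")
```

Add one more qubit and the requirement doubles to about 18 petabytes, which is why each additional qubit matters so much in the supremacy race.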
And there's another problem with quantum computers: qubits are inherently fragile. They can't remain in a quantum state for long (for qubits made of superconductors, we're talking tens of microseconds; for those using trapped ions, the record is 10 minutes). As qubits degrade, errors creep into their calculations, and the more qubits, the more errors. These errors can be corrected, either with additional qubits or with software, but doing so can consume so much computing power that it negates the advantage of using a quantum computer in the first place. (Microsoft Corp. is working on a radically different, inherently less error-prone design for a quantum computer, but it is based on exotic particles that some physicists aren't convinced even exist, and it is likely to be a decade before that machine, if it works at all, is available commercially.)
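The trade-off behind error correction can be illustrated with its simplest classical cousin, the repetition code: store one logical bit as several physical copies and take a majority vote. The vote suppresses errors, but only by spending extra resources; quantum error-correcting codes work on a related principle with far greater overhead. A toy sketch, using classical bit flips as a stand-in for qubit errors (the error rates here are illustrative, not measured hardware figures):

```python
import random

def noisy_copy(bit, p_flip):
    """Store a bit in a medium that flips it with probability p_flip."""
    return bit ^ (random.random() < p_flip)

def majority_vote(bits):
    """Recover the logical bit from noisy physical copies."""
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
p = 0.05          # per-copy error rate (illustrative)
trials = 100_000

# Without protection: the stored bit is simply wrong ~5% of the time.
raw_errors = sum(noisy_copy(1, p) != 1 for _ in range(trials))

# With a 3-copy repetition code: wrong only when 2+ copies flip.
encoded_errors = sum(
    majority_vote([noisy_copy(1, p) for _ in range(3)]) != 1
    for _ in range(trials)
)

print(raw_errors / trials, encoded_errors / trials)
```

The encoded error rate drops from about p to about 3 times p squared, but every logical bit now costs three physical ones. Quantum codes face the same arithmetic with much larger multipliers, which is the overhead the article describes.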
So, for most applications, traditional computers still have a significant edge and likely will for years to come. Even those bullish on the technology, including Google itself, think it will be another decade before either error-corrected or error-free quantum computing is possible. Benjamin fears that this decade-long gap is wide enough to be a black hole for investors' cash, and that if enough VCs lose enough money, quantum computing could experience a backlash similar to the "AI winters" that hobbled artificial intelligence research in the 1970s and again in the 1990s.
Benjamin says he doesn't want to denigrate Google's achievement. If the company can get its quantum computer to do something a standard one can't, it will still be an important milestone. "It deserves to be written about and celebrated that we are into uncharted territory," he says. But, rather than supremacy, Benjamin says it would be more accurate to describe Google's imminent breakthrough as quantum inimitability. Then again, as Benjamin admits, that just doesn't have quite the same ring to it, does it?