Google’s Sycamore chip powers a quantum computer

The era of practical quantum computers has begun — at least on one speed test showing “quantum supremacy.”

A Google quantum computer has far outpaced ordinary computing technology, an achievement called quantum supremacy that’s an important milestone for a revolutionary way of processing data. Google disclosed the results in the journal Nature on Wednesday. The achievement came after more than a decade of work at Google, including the use of its own quantum computing chip, called Sycamore.

“Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output,” Google researchers said in a blog post about the work.

The news, which leaked into the limelight in September with a premature paper publication, offers evidence that quantum computers could break out of research labs and head toward mainstream computing. They could perform important work like creating new materials, designing new drugs at the molecular level, optimizing financial investments and speeding up delivery of packages. And the quantum computing achievement comes as progress with classical computers, as measured by the speed of general-purpose processors and charted by Moore’s Law, has sputtered.

Google’s Sputnik moment

Google got to pick its speed test, but Hartmut Neven, one of the researchers, dismissed criticisms that the result is only a narrow victory.

“Sputnik didn’t do much either. It circled the Earth. Yet it was the start of the space age,” Neven said at a press conference. He spoke at Google’s quantum computing lab near Santa Barbara, California, next to the site of an actual Space Race milestone — the development of the Apollo missions’ lunar rover.

But it’s not the beginning of the end for classical computers, at least in the view of today’s quantum computing experts. Quantum computers are finicky, exotic and have to run in an extremely controlled environment, and they’re not likely to replace most of what we do today on classical computers.



Instead, quantum computers will function as accelerators for classical machines, useful enough to be essential. “It will be a must-have resource at some point,” Neven said.

Quantum computing researcher Scott Aaronson likened the step to landing on the moon in terms of momentousness. And in a tweet Wednesday, Google Chief Executive Sundar Pichai called it a “big breakthrough.” Exemplifying how hard the work is, though, the quantum supremacy paper has a whopping 77 authors.

A vast industry is devoted to improving classical computers, but a small number of expensive labs at companies such as Google, Intel, Microsoft, Honeywell, Rigetti Computing and IBM are pursuing general-purpose quantum computers, too. They’re finicky devices, running in an environment chilled to just a hair’s breadth above absolute zero to minimize the likelihood they’ll be perturbed. Don’t expect to find a quantum computer on your desk.

Google’s speed test has applications to computing work like artificial intelligence, materials science and random number generation, the paper said.

And already, Google’s quantum computing researchers are talking to Google security team members about how the random number technology could be used to generate encryption keys, said Dave Bacon, leader of the quantum computing software effort.

Google’s first customers — the US Department of Energy and automakers Daimler and Volkswagen — will be able to use the machine in 2020, Google said. As with IBM’s quantum computing effort, it’ll be available as a cloud computing service over the internet.

However, physicist John Preskill, who came up with the term “quantum supremacy” in 2012, threw some cold water on that idea. Google’s chosen test is good for showing quantum computing speed but “not otherwise a problem of much practical interest,” Preskill said in October after the paper’s premature release.

Quantum vs. classical computers

Nearly every digital device so far, from ENIAC in 1945 to Apple’s iPhone 11 in 2019, is a classical computer. Their electronics rely on logic circuits to do things like add two numbers and on memory cells to store the results.

Google’s quantum computer looks nothing like a conventional machine. When running, all its complexity is hidden away and refrigerated to near absolute zero.

Quantum computers are entirely different, reliant instead on the mind-bending rules of physics that govern ultrasmall objects like atoms.

Where classical computers store and process data as individual bits, each a 1 or a 0, quantum computers use a different foundation, called a qubit. Each qubit can store a combination of different states of 1 and 0 at the same time through a phenomenon called superposition. Told you it was weird.

Not only that, but multiple qubits can be ganged together through another quantum phenomenon called entanglement. That lets a quantum computer explore a vast number of possible solutions to a problem at the same time.
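The two ideas above can be made concrete with a toy simulation. This is a minimal sketch, not how real hardware works: a simulator simply stores all 2^n amplitudes of an n-qubit state, puts one qubit into superposition, then entangles it with a second.

```python
import math

# Toy two-qubit state-vector simulator (illustrative only; real quantum
# hardware does not store these numbers explicitly). An n-qubit state is
# 2**n complex amplitudes; qubit 0 is the low-order bit of each index.
# Start in |00>: amplitude 1 for index 0, zero elsewhere.

def hadamard_q0(s):
    """Put qubit 0 into an equal superposition of 0 and 1."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[1]), h * (s[0] - s[1]),
            h * (s[2] + s[3]), h * (s[2] - s[3])]

def cnot(s):
    """Flip qubit 1 whenever qubit 0 is 1, entangling the pair.

    Swaps the amplitudes of |01> (index 1) and |11> (index 3)."""
    return [s[0], s[3], s[2], s[1]]

state = cnot(hadamard_q0([1.0, 0.0, 0.0, 0.0]))

# Probability of each measurement outcome is |amplitude|**2.
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]
```

The result is the Bell state: measuring the pair yields 00 or 11 with equal probability, never 01 or 10 — the qubits’ fates are linked, which is entanglement in miniature.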

Exponential speedups

In principle, a quantum computer’s performance grows exponentially: add one more qubit, and you’ve doubled the number of solutions you can examine in one fell swoop. For that reason, quantum computing engineers are working to increase the number of qubits in their machines.
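That doubling also explains why simulating a quantum computer classically gets hard so fast. A rough back-of-the-envelope script, assuming an ideal simulator that stores one 16-byte complex amplitude per basis state (an arbitrary but common figure):

```python
def state_bytes(n_qubits):
    """Memory needed to hold an n-qubit state vector, at 16 bytes
    per complex amplitude. Doubles with every qubit added."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 53):
    print(f"{n} qubits: {2 ** n:,} amplitudes, {state_bytes(n):,} bytes")
```

At 53 qubits, the count Google’s experiment used, the full state runs to roughly 144 petabytes — far beyond any single machine’s memory, which is why classical simulations of such circuits need cleverer techniques than brute force.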

“We expect that their computational power will continue to grow at a double-exponential rate,” the Google researchers said in their paper. That’s even faster than the single exponential improvement charted for classical computer chips by Moore’s Law.

Google’s machine had 54 qubits, though one wasn’t working right, so only 53 were available. That happens to match the number in IBM’s most powerful quantum computer.

But qubit count isn’t everything. Unavoidable instabilities cause qubits to lose their data. To counteract the problem, researchers are also working on error correction techniques that let a calculation sidestep those failures.

IBM challenges Google’s quantum results

IBM is a major quantum computing fan, but it questioned Google’s prematurely released results in a blog post Monday.

“We argue that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity,” the IBM researchers wrote. They suggested different algorithms and a different classical computer design in a preprint paper of their own.

Google said it welcomes improvements to quantum computer simulation techniques but said its overall result is “prohibitively hard for even the world’s fastest supercomputer, with more double exponential growth to come. We’ve already peeled away from classical computers, onto a totally different trajectory.”

And you can try for yourself if you like. Google released its quantum computer’s raw output to encourage others to see if they can do better at simulating a quantum computer. “We expect that lower simulation costs than reported here will eventually be achieved, but we also expect that they will be consistently outpaced by hardware improvements on larger quantum processors,” the Google researchers said.

Intel didn’t offer an opinion on Google’s results, but did say quantum supremacy is “a strategic benchmark.”

“We are committed to moving quantum from the lab to commercialization,” said Jim Clarke, Intel Labs’ director of quantum hardware, in a statement.

Cracking your encrypted communications? Not yet

One quantum computing ability, mathematically proved with an idea called Shor’s algorithm, is cracking some of today’s encryption technology.

However, that will require vastly larger quantum computers and new technology breakthroughs to deal with error correction.

“Realizing the full promise of quantum computing (using Shor’s algorithm for factoring, for example) still requires technical leaps,” the researchers said in their paper.
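The number theory behind the threat can be sketched classically. Shor’s algorithm factors a number N by finding the period r of a^x mod N; a quantum computer finds that period exponentially faster, while the brute-force loop below stands in for it on small numbers. This is an illustrative sketch of the arithmetic, not the quantum algorithm itself.

```python
from math import gcd

def factor_via_period(n, a):
    """Factor n using the period of a**x mod n, the arithmetic at the
    heart of Shor's algorithm. The slow period-finding loop here is the
    step a quantum computer would replace."""
    if gcd(a, n) != 1:
        # Lucky guess: a already shares a factor with n.
        return gcd(a, n), n // gcd(a, n)
    # Find the period r: the smallest r > 0 with a**r % n == 1.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    if r % 2 == 1:
        return None  # odd period: this choice of a doesn't help
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    if p in (1, n) or q in (1, n):
        return None  # trivial result: try a different a
    return p, q

print(factor_via_period(15, 7))  # → (3, 5)
```

For the key sizes used in real encryption, N is hundreds of digits long and the period-finding loop becomes astronomically slow — which is exactly the gap a large, error-corrected quantum computer could close.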

And at the same time, the US government and others are working on “post-quantum” cryptography methods to withstand quantum computing cracking abilities.

So for now at least, quantum computing, while radically different, isn’t blowing up the tech industry.

Via Cnet.com