Quantum Leap for Computing Is a Small Step for Computers

Last month, quantum computing achieved a controversial milestone, but it's unclear whether Google's demonstration of quantum supremacy meaningfully advances the quest to build a useful quantum computer.

Bloomberg News

November 15, 2019


(Bloomberg Opinion) -- In a landmark paper published in 1950, the mathematician Alan Turing proposed the eponymous Turing Test to decide whether a computer can demonstrate human-like intelligence. To pass the test, the computer must fool a human judge into believing it’s a person after a five-minute conversation conducted via text. Turing predicted that by the year 2000, a computer would be able to convince 30% of human judges; that criterion became a touchstone of artificial intelligence.

Although it took a bit longer than Turing predicted, a Russian chatbot presenting itself as a 13-year-old Ukrainian boy named Eugene Goostman was able to dupe 33% of judges in a competition held in 2014. Perhaps the cleverest aspect of the machine’s design was that its teenage disguise made it more likely that people would excuse its broken grammar and general silliness. Nevertheless, the strategy of misdirection comes across as transparent and superficial in conversations the chatbot had with skeptical journalists — so much so that one marvels not at the computer’s purported intelligence, but at the gullibility of the judges. Sadly, conquering the Turing Test has brought us no closer to solving AI's big problems.

Last month, quantum computing achieved its own controversial milestone. This field aims to harness the laws of quantum mechanics to revolutionize computing. Classical computers rely on memory units called bits that encode either zero or one, so a state of the memory is a sequence of zeros and ones. Quantum computers, by contrast, use qubits, each of which encodes a “combination” of zero and one. In a quantum computer, multiple qubits interact, so a single state of the machine represents all of the exponentially many sequences of bits simultaneously.
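To make that “combination” concrete, here is the standard textbook notation (a sketch added for illustration, not part of the original column). A single qubit occupies a superposition

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]

and a register of $n$ qubits occupies a superposition over all $2^n$ bit strings,

\[
|\Psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x\,|x\rangle, \qquad \sum_x |\alpha_x|^2 = 1.
\]

Writing down the state of even 53 qubits thus takes $2^{53}$ (roughly $9 \times 10^{15}$) complex amplitudes, which is the sense in which all bit sequences are represented at once.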

The key question is whether this strange power can be exploited to perform computations that are beyond the reach of classical computers. Demonstrating even one such computation, however contrived, would amount to “quantum supremacy,” a term coined by the physicist John Preskill of the California Institute of Technology in 2012.

By this standard, Google appears to have achieved quantum supremacy. Specifically, the company said in October that its team used a 53-qubit quantum computer to generate random sequences of bits whose statistics depend on controlled interactions among its qubits. By Google’s calculations, it would take 10,000 years to carry out the same task using classical computation. There is no doubt that controlling a 53-qubit quantum computer is a feat of science and engineering. As Preskill put it, “the recent achievement by the Google team bolsters our confidence that quantum computing is merely really, really hard,” rather than being “ridiculously hard.”
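In the research literature, Google’s benchmark task is known as random circuit sampling. Schematically (my gloss in standard notation, not wording from the article): the machine applies a randomly chosen circuit $C$ to the all-zeros state and then measures every qubit, producing each 53-bit string $x$ with probability

\[
p(x) = \bigl|\langle x \,|\, C \,|\, 0^{53} \rangle\bigr|^2,
\]

where $|0^{53}\rangle$ denotes the all-zeros state. The claim is that no known classical algorithm can sample from this particular distribution in any reasonable amount of time.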

As long as Google’s quantum computer works as intended, however, its dominance isn’t surprising — because the competition is rigged. It’s a bit like building a robotic hand that flips coins according to given parameters (such as, totally off the top of my head, the angle between the normal to the coin and the angular momentum vector), and then challenging a classical computer to generate sequences of heads and tails that obey the same laws of physics. This robot hand would perform astounding feats of coin-flipping but wouldn’t be able to do kindergarten arithmetic — and neither can Google’s quantum computer.

It’s unclear, therefore, whether quantum supremacy is a meaningful milestone in the quest to build a useful quantum computer. To mention just one major obstacle (there are several), reliable quantum computing requires error correction. The catch is that quantum error correction protocols themselves demand fairly reliable qubits — and lots of them.

In some ways, quantum supremacy is akin to iconic AI milestones like the Turing Test, or IBM’s chess victory over Garry Kasparov in 1997, which was also an engineering tour de force. These achievements demonstrate specialized capabilities and garner widespread attention, but their impact on the overarching goals of their respective fields may ultimately be limited.

The danger is that excessive publicity creates inflated expectations of an imminent revolution in computing, despite measured commentary from experts. AI again provides historical precedent: The field has famously gone through several AI winters — decades in which talent fled and research funding ran dry — driven in large part by expectations that failed to materialize.

Quantum computing research started three decades after AI, in the 1980s, and experienced a burst of excitement after the mathematician Peter Shor, then at Bell Labs and now at the Massachusetts Institute of Technology, invented a quantum algorithm in 1994 that would, in theory, crush modern cryptography. But eventually the dearth of, well, quantum computers caught up with quantum computing, and by 2005 the field was experiencing a massive downturn. The current quantum spring started only a few years ago; its signs include a surge of academic research as well as major investments by governments and tech giants like Alphabet Inc., International Business Machines Corp. and Intel Corp.

Quantum computing and AI are two distinct fields (despite what whoever came up with the name Google AI Quantum would have you believe), and what is true for one isn't necessarily true for the other. But quantum computing can learn from AI's much longer career as an alternately overhyped and underappreciated field. I am tempted to say that the chief lesson is “winter is coming,” but it is actually this: the pursuit of artificial milestones is a double-edged sword.

