By Tristan Greene
Quantum computing is one of the most exciting technologies there is, but its basis in quantum physics makes it a pain in the ass to understand and even harder to do anything with. A recent breakthrough in physics research, however, might change all of that and start a computing revolution.
It wouldn’t be the first time this has happened.
IBM’s Thomas J. Watson (the person the Watson AI was named after) is often quoted as saying, in 1943, “I think there is a world market for maybe five computers.” That’s probably because, at the time, a computer filled up an entire room.
But, in 1971, that changed with the development of the world’s first microprocessors. Fast-forward to 1975, and the birth of the MITS Altair 8800 ushered in the personal computing age. It also inspired a young Bill Gates – who, with Paul Allen, wrote software for MITS – to co-found a little startup called Microsoft.
Here we are now, decades removed from the naysayers who thought classical computers would never catch on, and we’ve got more raw processing power in our phones than all of the supercomputers that ran the Apollo space program combined.
Quantum computing has, so far, followed a somewhat similar trajectory.
Currently IBM, Google, Microsoft, Rigetti, and a handful of other companies have quantum computing systems that are very much like the old room-sized supercomputers of last century. They’re huge, they require obnoxious amounts of power, and they’re only feasible in a laboratory environment.
And there is no shortage of researchers, tech journalists, and experts who’ll tell you that quantum computers will never be feasible as consumer electronics. If you listen to these people, you might think there’s only a worldwide market for about five of these giant systems.
But, just like the invention of the microprocessor, scientists in the quantum computing field may have found their eureka moment in recently published physics research conducted by an international team of scientists.
In a paper titled “Using Machine Learning for Scientific Discovery in Electronic Quantum Matter Visualization Experiments,” the team explores a 20-year-old hypothesis that could lead to the creation of a room-temperature superconductor.
The researchers, who hail from prestigious institutions such as Cornell, Harvard, Université Paris-Sud, Stanford, the University of Tokyo, and other centers of academia, set out to determine why superconductors only conduct electricity without resistance at extremely low temperatures.
There’s a long-standing physics problem with a class of superconductors called “cuprates” that nobody has been able to figure out yet. As a cuprate’s temperature is lowered toward the point where it superconducts, it enters a mysterious state called a “pseudogap,” wherein researchers aren’t able to determine what’s happening. According to Nature, revealing what’s actually happening in the pseudogap is the key to understanding the whole process:
Complex interactions between electrons and atoms make the pseudogap theoretically difficult to describe, and its chaotic nature challenging to observe. Some physicists call the state the cuprates’ ‘dark matter’, yet explaining the pseudogap may be key to understanding superconductivity.
Humans simply can’t “see” what’s going on with the matter as it goes through state changes, and even under direct observation there’s no chance a person could make heads or tails of what they’re seeing.
So the team created a machine learning paradigm that could figure out whether experimental imaging data supported one hypothesis (cuprates’ pseudogap is the result of strong interactions between particles) or another (it’s the result of weakly interacting waves).
The result? According to the AI, the behavior of the pseudogap more closely resembles the particle-like hypothesis than the wave-like one. Unfortunately, there was no “C” option, so this work isn’t definitive by any means. The algorithms had to pick between the two hypotheses: Neural networks aren’t smart enough to come up with their own.
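Stripped of the physics, the setup is a supervised binary classifier: train a model on simulated images generated under each hypothesis, then ask which class the experimental images look more like. Here’s a minimal sketch of that idea in Python – the synthetic “particle-like” and “wave-like” image generators and the plain logistic-regression model are illustrative assumptions, not the team’s actual data or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(hypothesis, n=200, size=64):
    """Generate toy 1-D 'images' for each hypothesis (purely illustrative):
    particle-like = sharp localized spots, wave-like = smooth ripples."""
    X = np.zeros((n, size))
    for i in range(n):
        if hypothesis == "particle":
            spots = rng.integers(0, size, 5)
            X[i, spots] = 1.0          # a few bright, localized points
        else:
            x = np.arange(size)
            freq = rng.uniform(1, 4)   # smooth sinusoidal modulation
            X[i] = 0.5 + 0.5 * np.sin(2 * np.pi * freq * x / size)
        X[i] += 0.1 * rng.normal(size=size)  # measurement noise
    return X

def train_logistic(X, y, lr=0.5, steps=500):
    """Plain logistic regression via gradient descent -- standing in for
    the neural networks used in the actual study."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        z = np.clip(X @ w + b, -30, 30)
        p = 1 / (1 + np.exp(-z))
        grad = p - y
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

# Label 1 = particle-like hypothesis, 0 = wave-like hypothesis
Xp, Xw = simulate("particle"), simulate("wave")
X = np.vstack([Xp, Xw])
y = np.concatenate([np.ones(len(Xp)), np.zeros(len(Xw))])

w, b = train_logistic(X, y)
logits = np.clip(X @ w + b, -30, 30)
preds = (1 / (1 + np.exp(-logits)) > 0.5).astype(float)
accuracy = (preds == y).mean()
```

On data this cleanly separable the classifier scores near-perfectly; the hard part in the real experiment is that the two hypotheses produce images far subtler than these toys, which is why the team needed machine learning in the first place.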
But, a deeper understanding of how superconductivity works could lead to the development of a “microprocessor” for quantum computers. Wrangling qubits isn’t quite the same thing as ordering logic gates, but this research goes a long way towards clearing the fog-of-war that obstructs further development of quantum computers.
It’s not the solution to the “how do we make quantum computers work without needing to keep them at near absolute-zero temperatures” problem, but it’s a start. Optimists might consider this work a snowball that could cause a quantum computing avalanche.
Tristan Greene is a sailor gleefully writing about consumer-friendly artificial intelligence advances and political policy concerning tech.