IBM today announced the largest working quantum computer yet, with more than 1,000 qubits – the quantum counterparts of the classical bits that underpin today's computers.
While IBM is concentrating on the hardware and its super-cooled supporting infrastructure, other companies – including several startups – are providing parts of the software ‘stack’, such as circuits and algorithms, that let quantum computing run via the cloud. IBM alone runs many billions of quantum circuits every day, a figure expected to increase dramatically as the technology matures.
“The future is going to be where trillions of quantum circuits are being executed every day on quantum hardware behind the scenes,” said Dario Gil, IBM’s quantum project lead, “and there’s going to be a tremendous amount of innovation and creativity and intellectual property on these circuits. So you can’t do it all yourself.”
Global data is forecast to reach 175 zettabytes by 2025, and ‘classical’ supercomputers and server farms – like those run by the tech giants – can’t keep up. That’s why companies like IBM are keen to collaborate and build an ecosystem to support a quantum future: it’s the only way to stay ahead of the exponential curve.
Quantum computers are well suited to problems with vast numbers of interdependent possibilities, such as mass route optimization, where every participant’s route affects everyone else’s at the same time. Traditional computer clusters can take weeks of ‘heavy lifting’ to work through the possible combinations, while a quantum computer could, in principle, return results in seconds.
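To make the idea of a quantum circuit concrete, here is a minimal statevector sketch in plain Python of a two-qubit circuit that creates an entangled ‘Bell state’. It is purely illustrative – a toy simulation of superposition and entanglement, not IBM’s actual software stack:

```python
import math

# State of 2 qubits = 4 complex amplitudes over |00>, |01>, |10>, |11>.
# Index = 2 * q0 + q1. We start in |00>.
state = [1.0, 0.0, 0.0, 0.0]

def apply_h_qubit0(s):
    """Hadamard on qubit 0: puts it into an equal superposition."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def apply_cnot(s):
    """CNOT (control = qubit 0): flips qubit 1 when qubit 0 is 1,
    entangling the pair."""
    return [s[0], s[1], s[3], s[2]]

state = apply_cnot(apply_h_qubit0(state))
probs = [abs(a) ** 2 for a in state]
# probs ≈ [0.5, 0.0, 0.0, 0.5]: a measurement gives 00 or 11,
# each half the time – the two qubits are perfectly correlated.
```

Real devices run such circuits physically rather than simulating amplitudes; the cost of classical simulation doubles with every added qubit, which is exactly why the hardware matters.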
But there’s a catch: qubits are fragile and error-prone. IBM’s hardware is among the most stable in the world, and even IBM is struggling to push the error rate from about 1% in 2020 toward a target closer to 0.0001% by 2025. And in today’s warp-speed world, an error rate of even one hundredth of 1% just won’t cut it – ask SpaceX when they’re landing a Starship rocket.
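Why do those tiny percentages matter so much? Errors compound across every gate in a circuit: assuming independent per-gate errors (a simplifying assumption), a circuit of n gates with per-gate error rate p succeeds with probability roughly (1 − p)^n. A rough back-of-the-envelope sketch:

```python
# Back-of-the-envelope model: assumes each gate fails independently
# with probability per_gate_error.
def circuit_success_probability(per_gate_error: float, num_gates: int) -> float:
    return (1.0 - per_gate_error) ** num_gates

# A modest 1,000-gate circuit:
p_2020 = circuit_success_probability(0.01, 1000)      # 1% error -> ~0.00004
p_2025 = circuit_success_probability(0.000001, 1000)  # 0.0001% error -> ~0.999
```

At a 1% error rate almost every run of a non-trivial circuit fails, while at 0.0001% nearly every run succeeds – which is why the error rate, not just the qubit count, decides what quantum hardware can actually do.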
Image credit: IBM