Quantum computing has long faced a limitation inherent in its nature: the most complex calculations require many qubits, its basic unit of information, but qubits are unstable, and the more of them there are, the more errors accumulate. Attempts to overcome this barrier with improved techniques have so far been insufficient to guarantee quantum advantage (calculations more efficient than on any other supercomputer). Jay Gambetta, vice president of IBM, announced on Tuesday that the company has found the formula, a combination of technologies and programming, that allows the development of Starling, "the first large-scale, fault-tolerant quantum supercomputer in the world."
IBM has begun building Quantum Starling at its quantum data center in Poughkeepsie (New York), and it will be operational in four years. According to the company, it will execute 20,000 times more circuits than current quantum computers and will be able to perform 100 million operations using 200 logical qubits. A physical qubit is the one that exists in a device (such as an ion), but it is very unstable and any interference (noise) destroys its ephemeral state. A logical qubit is virtual, built from several physical qubits with error correction; it is what makes it possible to store and process information.
A new error-correction system, presented last March, is what has led IBM to believe it can overcome the limitations found with other schemes, such as the usual surface code. It is the LDPC, or low-density parity-check code. "This protocol for end-to-end quantum error correction implements fault-tolerant memory based on a family of low-density parity-check codes with an error threshold of 0.7% for the standard noise model," the researchers write.
This model reduces the number of physical qubits needed to build logical ones. According to the research, "12 logical qubits can be preserved for almost one million cycles using only 288 physical qubits." Other schemes, such as the aforementioned surface code, would require almost 3,000 physical qubits to achieve the same performance.
In this way, the code "reduces the overhead needed for error correction by 90%" and opens the door to a stable system of sufficient size to contemplate quantum advantage. In fact, IBM calculates that the future quantum computer will have a quadrillion times more memory than the largest current supercomputer.
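As a quick sanity check, the overhead figures quoted above can be verified with a few lines of arithmetic. This is a minimal sketch using only the numbers reported in the article (12 logical qubits from 288 physical ones with the LDPC code, versus roughly 3,000 with a surface code); the constant names are illustrative, not taken from IBM's paper:

```python
# Figures quoted in the article for equivalent logical-qubit performance.
LOGICAL_QUBITS = 12
PHYSICAL_LDPC = 288      # physical qubits with the LDPC code
PHYSICAL_SURFACE = 3000  # approximate surface-code requirement

# Physical-to-logical overhead of each scheme.
ratio_ldpc = PHYSICAL_LDPC / LOGICAL_QUBITS        # 24 physical per logical
ratio_surface = PHYSICAL_SURFACE / LOGICAL_QUBITS  # 250 physical per logical

# Relative saving in physical qubits.
savings = 1 - PHYSICAL_LDPC / PHYSICAL_SURFACE
print(f"{ratio_ldpc:.0f} physical qubits per logical qubit (LDPC)")
print(f"{ratio_surface:.0f} physical qubits per logical qubit (surface code)")
print(f"overhead reduction: {savings:.0%}")  # ~90%, matching IBM's claim
```

The 288-versus-3,000 comparison works out to a 90.4% reduction in physical qubits, consistent with the roughly 90% figure the company cites.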
IBM's chairman and CEO considers that Starling "charts the next frontier in quantum computing." "Our expertise in mathematics, physics and engineering is paving the way for a large-scale, fault-tolerant quantum computer, one that will solve real-world challenges and unlock immense possibilities for business," he says.
Starling, according to the company, "will be able to execute algorithms that could drastically accelerate efficiency across industries, including drug development, materials discovery, chemistry, logistics optimization and financial optimization, among many other areas."
Error correction is not the only path to Starling's goal. It also requires hardware developments that are already being tested at IBM's headquarters. "Over the next four years, we will launch increasingly large and interconnected quantum processors, and each of them will demonstrate specific criteria established in IBM's research on how to scale fault tolerance. Together, these advances will combine to become Starling," the company explains.
This roadmap includes the following milestones: IBM Quantum Loon (this year), designed to test architecture components for the LDPC code, including the "couplers" that connect qubits over longer distances within the same chip; Kookaburra (2026), the first modular processor that will combine quantum memory with logical operations; and Cockatoo (2027), which will entangle two Kookaburra modules and avoid building impractically large chips.
These advances are the basis for completing Starling in 2029, and it, in turn, will be the foundation of IBM Blue Jay in 2033, which will be able to execute, according to Matthias Steffen, a researcher on the company's quantum team, "one billion quantum operations across 2,000 logical qubits," ten times more powerful than the model announced this Tuesday.
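The "ten times more powerful" claim is consistent across both headline figures: from Starling to Blue Jay, the logical-qubit count and the operation count each grow tenfold. A small check using only the numbers in the article (the dictionary keys are illustrative):

```python
# Headline figures for the two announced machines, as stated in the article.
starling = {"year": 2029, "logical_qubits": 200, "operations": 100_000_000}
blue_jay = {"year": 2033, "logical_qubits": 2_000, "operations": 1_000_000_000}

# Both metrics scale by the same factor of 10 from one machine to the next.
qubit_factor = blue_jay["logical_qubits"] / starling["logical_qubits"]
ops_factor = blue_jay["operations"] / starling["operations"]
print(qubit_factor, ops_factor)  # 10.0 10.0
```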
The objective is what is considered the holy grail of quantum computing, without waiting to find the Majorana particle, a hypothetical quasiparticle presumed capable of maintaining coherence that has not yet been identified. It is a matter of "balancing sufficient control and coupling while preserving quantum coherence."
The company, committed to this goal for more than a decade, draws on 80 systems already deployed and operational, the experience of 600,000 users and the collaboration of 300 scientific, technological and industrial entities around the world.
Gambetta says that, with the new advances, "large-scale fault-tolerant quantum computing is no longer a matter of science, but an engineering challenge." "We are sure we can build it: we have the architecture, we have the hardware, we have the scientific advances and, now, we see it as an engineering path," the researcher insists.
And he concludes: "IBM Quantum's goal is to build this machine and work with our partners on algorithms, with high hopes that this is the future of the quantum industry. We will demonstrate quantum advantage and, definitely, it will happen in the coming years."
IBM is not alone in the race toward effective quantum advantage, and this 2025, declared by UNESCO the International Year of Quantum Science and Technology on the centenary of the discoveries that opened the door to the microscopic world, has been full of announcements: Google has presented its Willow chip, to which the multinational attributes the ability to solve in five minutes a task that would take a supercomputer ten septillion years; Microsoft has claimed to have found a new state of matter with which to tame the elusive Majorana particle; and Amazon Web Services (AWS) has presented Ocelot, a new quantum computing processor that, according to the company, can "reduce error-correction costs by up to 90%."
All claim to have reached achievements that anticipate a new era of computing. However, a co-author of the Ocelot work warns: "We are on a long-term quest to build a useful quantum computer that can do things even the best supercomputers cannot, but scaling them up is a great challenge."