2 Sources
[1]
We might finally know how to use quantum computers to boost AI
Quantum computers might eventually be able to handle some AI applications that currently require huge amounts of conventional computing power. Such a development would be a major boost to machine learning and similar artificial intelligence algorithms.

Quantum computers hold the promise of eventually being able to complete certain calculations that are impossible for conventional computers. For years, researchers have been debating whether these advantages over conventional computers extend to tasks that involve lots of data, and the algorithms that learn from them - in other words, the machine learning that underlies many AI programs.

Now, Hsin-Yuan Huang at the quantum computing firm Oratomic and his colleagues argue that the answer ought to be "yes". Their mathematical work aims to lay the foundations for a future where quantum computers offer a broad boost to AI. "Machine learning is really utilised everywhere in science and technology and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there's massive datasets available," he says.

His team's work addresses the key question of how data collected in the non-quantum world, such as restaurant reviews or results from sequencing RNA, could be input into a quantum computer in such a way that the computer's quantumness can be leveraged to process the data, and learn from it, more efficiently. This requires putting all of the data into a "superposition state", which is a mathematical combination that cannot be created in non-quantum machines. But until now, researchers thought that performing this task would be impractical. This is because they assumed that all of the data in that superposition state would have to be saved into dedicated memory devices prior to being processed by the quantum computer - but those memory devices would have had to be impossibly large, says team member Haimeng Zhao at the California Institute of Technology.
Huang and his colleagues took a different approach that doesn't require such memories. It involves inputting the data into the quantum computer in smaller batches, without having to save it all before beginning to process it, similar to streaming a movie rather than downloading it in full prior to watching it. They showed not only that this approach can work but that it would allow the quantum computer to process more data at a smaller memory cost than any conventional computer.

The memory advantage is so large, in fact, that a quantum computer made from about 300 error-proof building blocks called logical qubits would outperform a classical computer built using every atom in the observable universe, says Zhao. We are maybe many years away from building quantum computers with 300 logical qubits, but Huang says that a 60-logical-qubit computer could plausibly be built by the end of the decade. The team's analysis suggests that, at this size, there would already be a notable quantum advantage over classical computers for some tasks that involve processing large datasets and that AI is used for.

"The quantum machine is a very powerful device, but you do need to first feed it. This study talks about feeding and how it's enough to load [data] bit by bit, without overfeeding the beast," says Adrián Pérez-Salinas at ETH Zurich in Switzerland. Nevertheless, he says that many questions about applying the new work to actual devices and real-world data still need to be addressed. Many past quantum machine-learning algorithms were eventually shown to be amenable to "dequantisation", a process in which the algorithms were adapted to no longer require any quantum hardware while retaining their excellent performance. It will be important to examine how crucial quantumness is to this new algorithm too, says Pérez-Salinas.
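The streaming analogy has a simple classical counterpart: many quantities can be computed over data arriving in batches while keeping only a small amount of running state, never the full dataset. The sketch below is purely illustrative of that memory pattern; the team's actual quantum protocol is far more involved and is not reproduced here.

```python
# Classical analogy for batched, streaming-style input: compute a summary
# over a stream of data chunks while storing only O(1) running state,
# instead of saving the whole dataset into memory first.

def stream_mean(chunks):
    """Running mean over an iterable of data chunks, without storing them."""
    total, count = 0.0, 0
    for chunk in chunks:          # each chunk is consumed, then discarded
        total += sum(chunk)
        count += len(chunk)
    return total / count if count else 0.0

# The stream can be consumed lazily, like a movie being streamed:
batches = ([1, 2], [3, 4], [5, 6])
print(stream_mean(batches))  # 3.5
```

The point of the analogy is only the memory profile: the full dataset never needs to sit in storage before processing begins.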
Vedran Dunjko at Leiden University in the Netherlands says that the new work could be a good match for large scientific experiments, such as those at the Large Hadron Collider, where millions of gigabytes of data are continuously created but most of it gets discarded because of insufficient computer memory. But it's likely that only some current AI applications and similar kinds of data processing will be amenable to being handled with a quantum computer rather than with a data centre full of conventional servers, he says. "This is not the majority of what GPUs are heating up the planet for, but may still be important," says Dunjko.

The researchers are now working both on expanding the kinds of algorithms their method could be useful for and on devising new ways to configure quantum computers that would make them fast enough to handle data not just with very little memory but in a practical amount of time.
[2]
Researchers Use Quantum Computer to Improve AI Predictions
AI models have been helping with predictions for a while now. Doctors, weather forecasters and stock brokers all use AI to try to peek into the future. Inside the Leibniz Supercomputing Centre in Germany, researchers have been experimenting with an AI model and a quantum computer. The quantum computer helps the AI with complex predictions it can't handle alone.

The research team from University College London, who published their findings on Friday in the journal Science Advances, say that one day, quantum computers could help AI models make fast, accurate predictions across a range of industries, predictions that would take regular computers weeks to figure out. "The paper demonstrates that for these kinds of studies, even today's relatively small and unreliable quantum devices can enhance the predictions of conventional AI models," Peter Coveney, UCL professor and the study's coauthor, told CNET.

Quantum computers differ from regular computers in several ways, including being able to perform simultaneous calculations rather than step-by-step calculations, and using quantum bits. While classical computers use bits as the smallest data unit, with each representing either a zero or a one, qubits can represent both zero and one simultaneously (superposition). Two qubits can also be linked together (entanglement). Superposition and entanglement allow quantum computers to solve complex problems much faster than traditional computers. But quantum computers are incredibly delicate and must be kept at extremely low temperatures, making them impractical for everyday use.
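The superposition and entanglement mentioned here can be illustrated with a toy state-vector simulation. The sketch below prepares a Bell state, the standard textbook example of an entangled qubit pair; it illustrates the concepts only and is not the researchers' method.

```python
from math import sqrt

# Minimal two-qubit state-vector sketch. Amplitudes are ordered
# |00>, |01>, |10>, |11> (first qubit is the high bit).

def apply_hadamard_q0(state):
    """Hadamard on the first qubit: mixes the |0x> and |1x> amplitudes."""
    h = 1 / sqrt(2)
    a, b, c, d = state
    return [h * (a + c), h * (b + d), h * (a - c), h * (b - d)]

def apply_cnot(state):
    """CNOT with the first qubit as control: swaps the |10> and |11> amplitudes."""
    a, b, c, d = state
    return [a, b, d, c]

def bell_state():
    state = [1.0, 0.0, 0.0, 0.0]      # start in |00>
    state = apply_hadamard_q0(state)  # superposition: (|00> + |10>) / sqrt(2)
    return apply_cnot(state)          # entangled:     (|00> + |11>) / sqrt(2)
```

In the resulting state the two qubits are perfectly correlated: measuring the first as 0 or 1 forces the second to match, which is exactly the linkage the article calls entanglement.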
But while today's quantum computers are still experimental and often finicky, they might help AI solve big problems that would otherwise be too complicated or time-consuming. The AI model used in the study is housed on a supercomputer connected to the quantum computer at the research center. The team used this setup to predict how gases and liquids in a system would move and interact over an extended period. Climate science, medicine and city engineering all use this kind of modeling.

"Our new method appears to demonstrate 'quantum advantage' in a practical way -- that is, the quantum computer outperforms what is possible through classical computing alone," coauthor Maida Wang, a PhD student at UCL, said in an announcement.

Quantum computers are incredibly sensitive. Even tiny disturbances in the environment throw off the calculations, so the technology is still mostly used in research labs. Because quantum computing is still limited, the researchers did most of the study with the supercomputer. The AI model handled the data processing, then used the quantum computer for one step. After completing the hard calculations, the quantum computer handed the reins back to the AI model, which took care of everything else. "Even today's noisy and error-prone quantum devices can enhance the performance of conventional machine-learning algorithms trained on data from modern supercomputers," Coveney said.

Hooking up an AI model to run calculations on a quantum computer might sound outlandish, but there are already real examples of companies using this approach in healthcare. In 2025, Google said its Quantum Echoes algorithm could calculate the structure of molecules that could pave the way for future drug discovery. Also, last year, the University of Toronto and Insilico Medicine used AI with a quantum computer to build molecules that target an "undruggable" form of cancer.
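The division of labour described here, with the classical AI model doing most of the processing and handing one hard step to a quantum device, can be sketched as a pipeline skeleton. Every function below is a hypothetical stand-in (the study's actual model, data, and quantum step are not detailed in this article), and the quantum step is mocked with a classical placeholder.

```python
# Hypothetical skeleton of a hybrid classical-quantum workflow:
# classical preprocessing -> one quantum subroutine -> classical postprocessing.

def classical_preprocess(raw):
    """Stand-in for the AI model's data processing, e.g. normalisation."""
    peak = max(raw)
    return [x / peak for x in raw]

def quantum_subroutine(features):
    """Placeholder for the single step delegated to quantum hardware.
    Here it is mocked classically as a simple average."""
    return sum(features) / len(features)

def classical_postprocess(value):
    """Stand-in for the AI model finishing the prediction."""
    return round(value, 3)

def hybrid_predict(raw):
    features = classical_preprocess(raw)     # supercomputer side
    hard_part = quantum_subroutine(features) # quantum side (mocked)
    return classical_postprocess(hard_part)  # back to the AI model

print(hybrid_predict([2, 4, 6, 8]))  # 0.625
```

The structural point is simply that the quantum device sits inside an otherwise classical loop, handling one delegated step, which matches the article's description of the setup at the research center.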
While there are still challenges with ensuring predictions are reliable, as well as with the sheer size of the datasets involved, Coveney said quantum computers can improve complex predictions. "We are already at work on real-world applications," he said.
Researchers have demonstrated that quantum computers can enhance artificial intelligence by processing large datasets more efficiently than classical computers. A breakthrough method allows data to be fed into quantum systems in smaller batches, eliminating the need for massive memory storage. Scientists predict that a 300-logical-qubit quantum computer could outperform classical systems, while real-world experiments at Germany's Leibniz Supercomputing Centre show quantum advantage already improving AI predictions.
A mathematical breakthrough by Hsin-Yuan Huang at quantum computing firm Oratomic and his colleagues suggests that quantum computers can finally deliver practical benefits to artificial intelligence applications that currently demand enormous conventional computing power [1]. The research addresses a longstanding challenge: how to input real-world data into quantum systems in ways that leverage their unique computational advantages for machine learning tasks. For years, the debate centered on whether quantum computers could handle the data-intensive work that underlies modern AI, and this new approach provides a compelling answer.
The team's method eliminates a critical bottleneck that researchers previously thought made quantum machine-learning algorithms impractical. Instead of requiring impossibly large memory devices to store all data in a superposition state before processing, the new technique allows data to be fed into quantum computers in smaller batches—similar to streaming a movie rather than downloading it entirely before watching [1]. Haimeng Zhao at the California Institute of Technology notes that this memory advantage is so substantial that a quantum computer built from approximately 300 logical qubits would outperform a classical computer constructed using every atom in the observable universe.
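Zhao's comparison can be sanity-checked with back-of-the-envelope arithmetic: 300 qubits span a state space of 2^300 amplitudes, while the observable universe is commonly estimated to contain roughly 10^80 atoms. A quick check, using that common order-of-magnitude estimate as an assumption:

```python
# 2**300 is roughly 10**90 (91 decimal digits), comfortably exceeding
# the ~10**80 atoms usually estimated for the observable universe.
state_space = 2 ** 300
atoms_estimate = 10 ** 80   # common order-of-magnitude estimate, not from the paper

print(state_space > atoms_estimate)  # True
print(len(str(state_space)))         # 91
```

Python's arbitrary-precision integers make the comparison exact rather than approximate.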
While 300-logical-qubit systems remain years away, Huang suggests that a 60-logical-qubit computer could plausibly be constructed by the end of the decade. At this scale, the analysis indicates notable quantum advantage over classical computers for specific tasks involving processing large datasets that AI relies upon [1]. The approach works by leveraging qubits—quantum bits that can represent both zero and one simultaneously through superposition—enabling quantum systems to perform simultaneous calculations rather than the step-by-step operations of traditional machines [2].

Researchers from University College London have already demonstrated this quantum advantage in real-world conditions. Working at Germany's Leibniz Supercomputing Centre, the team published findings in Science Advances showing how combining AI with quantum computing can improve AI predictions for complex simulations [2]. Their setup connected an AI model housed on a supercomputer to a quantum computer, using it to predict how gases and liquids in a system would move and interact over extended periods—calculations relevant to climate science, medicine, and city engineering.

Peter Coveney, UCL professor and study coauthor, emphasizes that "even today's noisy and error-prone quantum devices can enhance the performance of conventional machine-learning algorithms trained on data from modern supercomputers" [2]. The experimental approach handles most data processing with the supercomputer, then uses the quantum computer for one critical step involving the hardest calculations before returning control to the AI model. This hybrid strategy acknowledges that quantum computers remain incredibly sensitive—even tiny environmental disturbances throw off calculations—making them impractical for everyday use outside research labs.

Adrián Pérez-Salinas at ETH Zurich notes that "the quantum machine is a very powerful device, but you do need to first feed it," and this study addresses feeding data bit by bit without overwhelming the system [1]. However, he cautions that many questions remain about applying this work to actual devices and real-world datasets. Past quantum machine-learning algorithms have sometimes been "dequantised"—adapted to work without quantum hardware while keeping excellent performance—making it important to examine how essential quantumness truly is to these new algorithms.

Vedran Dunjko at Leiden University suggests the approach could benefit large scientific experiments like those at the Large Hadron Collider, where millions of gigabytes of data are continuously generated but most gets discarded due to insufficient computer memory [1]. While likely not applicable to the majority of current AI applications heating data centers worldwide, the technology may prove vital for specific use cases. Healthcare applications are already emerging: Google's Quantum Echoes algorithm calculated molecular structures for drug discovery in 2025, while the University of Toronto and Insilico Medicine used AI with quantum computers to build molecules targeting an "undruggable" form of cancer [2].

Huang's team continues expanding the types of algorithms their method supports while devising new quantum computer configurations that could process data not just with minimal memory but in practical timeframes [1]. Coveney confirms researchers are "already at work on real-world applications," though challenges remain with ensuring prediction reliability and managing massive datasets [2]. The short-term focus centers on building 60-logical-qubit systems by decade's end, while longer-term implications suggest quantum computers could fundamentally reshape how machine learning handles scientific simulations across industries from pharmaceuticals to climate modeling.