3 Sources
[1]
We might finally know how to use quantum computers to boost AI
Quantum computers might eventually be able to handle some AI applications that currently require huge amounts of conventional computing power. Such a development would be a major boost to machine learning and similar artificial intelligence algorithms.

Quantum computers hold the promise of eventually being able to complete certain calculations that are impossible for conventional computers. For years, researchers have been debating whether these advantages over conventional computers extend to tasks that involve lots of data, and the algorithms that learn from them - in other words, the machine learning that underlies many AI programs. Now, Hsin-Yuan Huang at the quantum computing firm Oratomic and his colleagues argue that the answer ought to be "yes". Their mathematical work aims to lay the foundations for a future where quantum computers offer a broad boost to AI. "Machine learning is really utilised everywhere in science and technology and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there's massive datasets available," he says.

His team's work addresses the key question of how data collected in the non-quantum world, such as restaurant reviews or results from sequencing RNA, could be input into a quantum computer in such a way that the computer's quantumness can be leveraged to process the data, and learn from it, more efficiently. This requires putting all of the data into a "superposition state", which is a mathematical combination that cannot be created in non-quantum machines. But until now, researchers thought that performing this task would be impractical. This is because they assumed that all of the data in that superposition state would have to be saved into dedicated memory devices prior to being processed by the quantum computer - but those memory devices would have had to be impossibly large, says team member Haimeng Zhao at the California Institute of Technology.
Huang and his colleagues took a different approach that doesn't require such memories. It involves inputting the data into the quantum computer in smaller batches, without having to save it all before beginning to process it, similar to streaming a movie rather than downloading it in full prior to watching it. They showed not only that this approach can work but that it would allow the quantum computer to process more data at a smaller memory cost than any conventional computer. The memory advantage is so large, in fact, that a quantum computer made from about 300 error-proof building blocks called logical qubits would outperform a classical computer built using every atom in the observable universe, says Zhao.

We may be many years away from building quantum computers with 300 logical qubits, but Huang says that a 60-logical-qubit computer could plausibly be built by the end of the decade. The team's analysis suggests that, at this size, there would already be a notable quantum advantage over classical computers for some of the large-dataset processing tasks that AI is used for.

"The quantum machine is a very powerful device, but you do need to first feed it. This study talks about feeding and how it's enough to load [data] bit by bit, without overfeeding the beast," says Adrián Pérez-Salinas at ETH Zurich in Switzerland. Nevertheless, he says that many questions about applying the new work to actual devices and real-world data still need to be addressed. Many past quantum machine-learning algorithms were eventually shown to be amenable to "dequantisation", a process in which the algorithms were adapted to no longer require any quantum hardware while retaining their excellent performance. It will be important to examine how crucial quantumness is to this new algorithm too, says Pérez-Salinas.
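The streaming-versus-downloading idea is easiest to see in a purely classical sketch. The Python below is illustrative only - it is not the team's quantum algorithm, and the helper names are made up - but it shows the memory trade-off the analogy describes: holding an entire dataset in memory before processing versus consuming it item by item with a constant-size running summary.

```python
def full_load_mean(values):
    """Download-the-whole-movie approach: the entire dataset
    sits in memory before any processing starts."""
    data = list(values)              # memory grows with dataset size
    return sum(data) / len(data)

def streaming_mean(values):
    """Streaming approach: consume items as they arrive, keeping
    only a constant-size running summary, never the full dataset."""
    total, count = 0.0, 0
    for x in values:                 # one item at a time
        total += x
        count += 1
    return total / count

# A generator stands in for data arriving from an experiment or app;
# streaming_mean never materialises the million values at once.
readings = (0.5 * i for i in range(1_000_000))
print(streaming_mean(readings))      # 249999.75, computed in O(1) extra memory
```

The quantum method plays an analogous trick: the needed quantum states are prepared batch by batch during processing, so no dedicated memory large enough to hold the whole dataset's superposition is ever required.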
Vedran Dunjko at Leiden University in the Netherlands says that the new work could be a good match for large scientific experiments such as those at the Large Hadron Collider, where millions of gigabytes of data are continuously created but most of it gets discarded because of insufficient computer memory. But it's likely that only some current AI applications and similar kinds of data processing will be amenable to being handled with a quantum computer rather than with a data centre full of conventional servers, he says. "This is not the majority of what GPUs are heating up the planet for, but may still be important," says Dunjko.

The researchers are now working on both expanding the kinds of algorithms that their method could be useful for and devising new ways to configure quantum computers that would make them sufficiently fast to handle data not just with very little memory but in a practical amount of time.
[2]
Researchers Use Quantum Computer to Improve AI Predictions
AI models have been helping with predictions for a while now. Doctors, weather forecasters and stock brokers all use AI to try to peek into the future. Inside the Leibniz Supercomputing Centre in Germany, researchers have been experimenting with an AI model and a quantum computer. The quantum computer helps the AI with complex predictions it can't handle alone. The research team from University College London, who published their findings on Friday in the journal Science Advances, say that one day, quantum computers could help AI models make fast, accurate predictions across a range of industries, which would take regular computers weeks to figure out.

"The paper demonstrates that for these kinds of studies, even today's relatively small and unreliable quantum devices can enhance the predictions of conventional AI models," Peter Coveney, UCL professor and the study's coauthor, told CNET.

Quantum computers differ from regular computers in several ways, including being able to perform simultaneous calculations rather than step-by-step calculations, and using quantum bits. While classical computers use bits as the smallest data unit, with each representing either a zero or a one, qubits can represent both zero and one simultaneously (superposition). Two qubits can also be linked together (entanglement). Superposition and entanglement allow quantum computers to solve complex problems much faster than traditional computers. But quantum computers are incredibly delicate and must be kept at extremely low temperatures, making them impractical for everyday use.
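The scale behind claims like "300 logical qubits would outperform a classical computer built from every atom in the observable universe" comes from how quickly qubit state spaces grow: describing an n-qubit superposition classically takes 2^n amplitudes. A quick back-of-the-envelope check in Python (the ~10^80 atom count is a standard rough estimate, not a figure from these articles):

```python
def classical_amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes needed to write down an
    n-qubit superposition state on a classical machine."""
    return 2 ** n_qubits

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80   # common rough estimate

for n in (60, 300):
    exceeds = classical_amplitudes(n) > ATOMS_IN_OBSERVABLE_UNIVERSE
    print(f"{n} qubits -> 2**{n} amplitudes; "
          f"exceeds atom count: {exceeds}")
```

At 60 qubits the state description (about 10^18 amplitudes) is still enormous but conceivable; at 300 qubits it is roughly 2 x 10^90, which is why storing such a state classically is hopeless.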
But while today's quantum computers are still experimental and often finicky, they might help AI solve big problems that would otherwise be too complicated or time-consuming. The AI model used in the study is housed on a supercomputer connected to the quantum computer at the research center. The team used this setup to predict how gases and liquids in a system would move and interact over an extended period. Climate science, medicine and city engineering all use this kind of modeling.

"Our new method appears to demonstrate 'quantum advantage' in a practical way -- that is, the quantum computer outperforms what is possible through classical computing alone," coauthor Maida Wang, a PhD student at UCL, said in an announcement.

Quantum computers are incredibly sensitive. Even tiny disturbances in the environment throw off the calculations, so the technology is still mostly used in research labs. Because quantum computing is still limited, the researchers did most of the study with the supercomputer. The AI model handled the data processing, then called on the quantum computer for one step. After completing the hard calculations, the quantum computer handed the reins back to the AI model, which took care of everything else. "Even today's noisy and error-prone quantum devices can enhance the performance of conventional machine-learning algorithms trained on data from modern supercomputers," Coveney said.

Hooking up an AI model to run calculations on a quantum computer might sound outlandish, but there are already real examples of companies using this approach in healthcare. In 2025, Google said its Quantum Echoes algorithm could calculate the structure of molecules that could pave the way for future drug discovery. Also, last year, the University of Toronto and Insilico Medicine used AI with a quantum computer to build molecules that target an "undruggable" form of cancer.
While there are still challenges with ensuring predictions are reliable, as well as with the sheer size of the datasets involved, Coveney said quantum computers can improve complex predictions. "We are already at work on real-world applications," he said.
[3]
Quantum Computers Could Boost AI by Processing Large Datasets More Efficiently - Decrypt
Even relatively small quantum computers could show advantages for certain data-heavy tasks. Quantum computers may eventually help process some of the massive datasets used to train artificial intelligence, according to a report by New Scientist.

Drawing from an earlier study by Caltech, Google Quantum AI, quantum computing startup Oratomic, and MIT, researchers say one challenge has been getting large datasets -- often measured in terabytes or petabytes -- into a quantum computer. To use quantum effects, data must be converted into a quantum state, and preparing those states has traditionally required significant quantum memory. "Machine learning is really utilized everywhere in science and technology, and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there's massive datasets available," Hsin-Yuan Huang, CTO at Oratomic, said in a statement.

The study proposes that, rather than requiring the full dataset to be loaded into quantum memory first, the new method prepares the necessary quantum states during processing, reducing the memory burden. The researchers say this could allow quantum effects such as superposition to be used without extremely large storage systems.

The researchers say the approach could also allow quantum computers to process large datasets while using less memory than conventional systems, suggesting that a machine with about 300 logical qubits -- error-corrected quantum bits that can reliably perform calculations -- could outperform classical computers on certain tasks. Such a system does not yet exist; however, the researchers estimate that a quantum computer with roughly 60 logical qubits could begin outperforming classical systems on some data-processing tasks used in artificial intelligence, highlighting how advances in quantum computing could threaten fields such as cryptography and blockchain.
"People are used to quantum computers always being 10 years away," Oratomic co-founder and CEO Dolev Bluvstein previously told Decrypt. "But when you look at where we were a little over ten years ago, the best estimates of what would be required for Shor's algorithm were one billion qubits, at a time when the best systems we had in the lab were roughly five qubits."

Still, researchers say the connection between artificial intelligence and quantum computing is growing closer, as AI tools help scientists analyze and model complex quantum systems that would otherwise be difficult to simulate, accelerating work on quantum hardware and applications. "The quantum machine is a very powerful device, but you do need to first feed it," said Adrián Pérez-Salinas, professor of computational physics at ETH Zurich in Switzerland, in a statement. "This study talks about feeding and how it's enough to load [data] bit by bit, without overfeeding the beast."
Researchers have developed a breakthrough method for using quantum computers to boost AI performance on massive datasets. The technique allows quantum machines to process data in smaller batches without requiring impossibly large memory systems. A quantum computer with just 300 logical qubits could outperform a classical computer built using every atom in the observable universe for certain AI tasks.
Researchers have identified a practical path forward for using quantum computers to boost AI capabilities, potentially solving one of machine learning's biggest challenges: processing enormous datasets efficiently. The breakthrough centers on a new method that allows quantum computers to handle data-intensive artificial intelligence (AI) tasks without the impossibly large memory requirements previously thought necessary [1].
Source: Decrypt
Hsin-Yuan Huang at quantum computing firm Oratomic and his colleagues have developed an approach that fundamentally changes how large datasets can be fed into quantum systems. "Machine learning is really utilized everywhere in science and technology and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there's massive datasets available," Huang explained [3]. The work addresses a critical question that has puzzled researchers for years: whether quantum advantages extend to the data-heavy tasks that underpin modern AI programs.

The key innovation involves processing large datasets without storing everything in quantum memory first. Traditional approaches assumed all data would need to be placed into a superposition state—a mathematical combination unique to quantum systems—and saved into dedicated memory devices before processing could begin. Those memory devices would have been impractically large, according to team member Haimeng Zhao at the California Institute of Technology [1].
Source: New Scientist
Instead, the team developed a technique that inputs data into quantum computers in smaller batches, similar to streaming a movie rather than downloading it entirely before watching. This approach prepares the necessary quantum state during processing rather than requiring the full dataset to be loaded first [3]. The researchers demonstrated that this method allows quantum computers to process more data at a smaller memory cost than any conventional computer.

The memory advantage is so significant that a quantum computer built from approximately 300 error-proof building blocks called logical qubits would outperform a classical computer constructed using every atom in the observable universe, says Zhao [1]. While such systems may be years away, the team's analysis suggests a more modest 60-logical-qubit computer could plausibly be built by the end of the decade and would already show notable quantum advantage for AI on some data-processing tasks [1].
Source: CNET
Separate research from University College London demonstrated practical applications of the synergy between quantum computing and AI. Their team connected an AI model housed on a supercomputer to a quantum computer at the Leibniz Supercomputing Centre in Germany, using the setup to improve AI predictions about how gases and liquids in a system would move and interact over extended periods [2]. "The paper demonstrates that for these kinds of studies, even today's relatively small and unreliable quantum devices can enhance the predictions of conventional AI models," said UCL professor Peter Coveney.
The potential applications span multiple industries where algorithms process massive amounts of information. Vedran Dunjko at Leiden University in the Netherlands notes the approach could benefit large scientific experiments like those at the Large Hadron Collider, where millions of gigabytes of data are continuously created but most gets discarded due to insufficient computer memory [1].

Real-world implementations are already underway. In 2025, Google announced its Quantum Echoes algorithm could calculate molecular structures for drug discovery. The University of Toronto and Insilico Medicine used AI with quantum computers to build molecules targeting an "undruggable" form of cancer [2]. Coveney confirmed his team is "already at work on real-world applications" that leverage quantum machine-learning algorithms.

Adrián Pérez-Salinas at ETH Zurich in Switzerland offered a vivid analogy: "The quantum machine is a very powerful device, but you do need to first feed it. This study talks about feeding and how it's enough to load [data] bit by bit, without overfeeding the beast" [3]. However, he notes many questions about applying the new work to actual devices and real-world data still need addressing.

One concern involves "dequantisation"—a process where past quantum machine-learning algorithms were adapted to work without quantum hardware while retaining excellent performance. Researchers will need to examine how crucial quantumness is to these new algorithms [1]. Dunjko suggests that while not all current AI applications will benefit from quantum processing, the impact could still be significant: "This is not the majority of what GPUs are heating up the planet for, but may still be important."

The researchers continue expanding the types of algorithms their method supports and devising new ways to configure qubits that would make quantum computers sufficiently fast to handle data not just with minimal memory but in practical timeframes [1].