3 Sources
[1]
Thermodynamic Computing Promises Energy-Efficient AI Images
[Figure: Representation of a coupling pattern between representative hidden units and a visible layer from an independent dynamical trajectory of Whitelam's trained denoising thermodynamic computer.]

Generative AI tools such as DALL-E, Midjourney, and Stable Diffusion create photorealistic images. However, they burn lavish amounts of energy. Now a pair of studies finds that so-called thermodynamic computing might generate images using one ten-billionth the energy.

At the heart of many AI image generators are machine learning algorithms known as diffusion models. Programmers feed the models large sets of images to which they gradually add noise until they resemble the static on an out-of-tune analog television. They then train neural networks to reverse this process, enabling diffusion models to generate entirely new images when prompted. However, the digital computations that add noise and then conjure pictures from the static are energy-hungry. Now a new technique involving thermodynamic computing might generate images "with a much lower energy cost than current digital hardware can," says Stephen Whitelam, a staff scientist at Lawrence Berkeley National Laboratory in California.

Using nature's noise

Thermodynamic computing employs physical circuits that change in response to noise, such as that caused by random thermal fluctuations in the environment, to perform low-energy computations. For instance, a prototype chip from New York City-based startup Normal Computing consists of eight resonators connected to each other via special couplers. Programmers use the couplers to build a kind of calculator customized for the problem they want to study. Then they pluck the resonators, introducing noise into the resonator-coupling network and performing the calculation. After the system reaches equilibrium, the programmers can read the solution in the new configuration of the resonators.
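The equilibrium trick described above can be imitated on an ordinary computer. The sketch below is a hypothetical illustration, not Normal Computing's actual design: it treats the "resonators" as a noisy overdamped system whose coupler settings encode a matrix A and whose plucking encodes a vector b, so that the time-averaged equilibrium state approximates the solution of A x = b. The matrix, vector, temperature, and step counts are all arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Couplings between "resonators" (must be positive definite for a stable
# equilibrium) and the bias from "plucking" each resonator.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.5])

# Let thermal noise drive the network: overdamped Langevin dynamics whose
# stationary distribution is exp(-U/T) with U = x.A.x/2 - b.x, so the
# equilibrium mean is exactly the solution of A x = b.
x = np.zeros(2)
dt, T = 0.01, 0.1
samples = []
for step in range(100_000):
    x += (b - A @ x) * dt + np.sqrt(2 * T * dt) * rng.normal(size=2)
    if step > 20_000:                 # discard the pre-equilibrium transient
        samples.append(x.copy())

estimate = np.mean(samples, axis=0)   # time average at equilibrium
# estimate is close to np.linalg.solve(A, b)
```

The point of the sketch is that no arithmetic unit ever "computes" the answer: the noise does the exploring, and the answer is read off as a statistic of the equilibrium state, which is the sense in which plucked resonators can act as a calculator.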
In a 10 January Nature Communications article, Whitelam and a colleague showed it was possible to create a thermodynamic version of a neural network, laying the groundwork for image generation via thermodynamic computing. Whitelam's strategy would give a thermodynamic computer a set of images, then let those stored pictures degrade as the natural random interactions between the computer's components run until the couplings linking those components reach equilibrium. Next, the strategy would compute the probability that a thermodynamic computer with a given state of couplings could reverse the decay process, then adjust the values of those couplings to maximize that probability. In simulations run on conventional computers, published January 20 in Physical Review Letters, Whitelam found this training process can produce a thermodynamic computer whose settings generate images of handwritten digits. It could accomplish this without energy-intensive digital neural networks or a noise-generating pseudorandom number generator.

"This research suggests that it's possible to make hardware to do certain types of machine learning -- here, image generation -- with considerably lower energy cost than we do at present," Whitelam says. He cautions, however, that thermodynamic computers are currently rudimentary compared with digital neural networks. "We don't yet know how to design a thermodynamic computer that would be as good at image generation as, say, DALL-E," he says. "It will still be necessary to work out how to build the hardware to do this." Although he calculates that thermodynamic computers might have a huge advantage over regular computers in terms of energy efficiency, "it will be challenging to build a thermodynamic computer that can enjoy all of that advantage. It's likely that near-term designs will be something in between that ideal and current digital power levels."
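The logic of that training loop, degrade the data and then tune parameters so that reversal becomes as likely as possible, can be caricatured digitally. The following is a toy one-dimensional denoising score-matching sketch, not Whitelam's algorithm: the two-cluster "data", the single parameter theta, the noise scale, and the learning rate are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Stored images": 1D samples clustered at -1 and +1.
data = np.concatenate([rng.normal(-1, 0.1, 500), rng.normal(1, 0.1, 500)])

# Forward "degradation": add Gaussian noise, the digital stand-in for
# letting thermal fluctuations blur the data toward equilibrium.
sigma = 0.5
noisy = data + sigma * rng.normal(size=data.size)

# Reverse model: a one-parameter linear "score" s(x) = -theta * x.
# Fitting it by denoising score matching maximizes (a bound on) the
# probability that the model reverses the degradation.
theta, lr = 0.0, 0.01
for _ in range(200):
    target = (data - noisy) / sigma**2           # direction back toward data
    pred = -theta * noisy
    grad = np.mean((pred - target) * (-noisy))   # d(squared error)/d(theta)
    theta -= lr * grad

# Generation: start from pure noise and take noisy gradient (Langevin) steps,
# the analogue of the trained computer relaxing under thermal fluctuations.
x = rng.normal(size=1000)
eps = 0.05
for _ in range(500):
    x += eps * (-theta * x) + np.sqrt(2 * eps) * rng.normal(size=x.size)
```

In the real proposal the "parameters" are physical coupling strengths and the noise comes from the environment for free, which is where the claimed energy savings over a pseudorandom number generator and a digital network would come from.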
[2]
'Thermodynamic computing' could slash energy use of AI image generation by a factor of ten billion, study claims -- prototypes show promise but huge task required to create hardware that can rival current models
"It will still be necessary to work out how to build the hardware to do this"

A mind-bending new report claims that 'thermodynamic computing' could, in theory, drastically reduce the energy consumed by AI to generate images, using just one ten-billionth of the energy of current popular tools. As reported by IEEE Spectrum, two recent studies hint at the potential of this burgeoning technology, but its proponents admit the solution is rudimentary.

According to the report, Lawrence Berkeley National Laboratory staff scientist Stephen Whitelam claims thermodynamic computing could be used for AI image generation "with a much lower energy cost than current digital hardware can." In a January 10 article published by Whitelam and Corneel Casert, also of Berkeley, the pair outlined how "it was possible to create a thermodynamic version of a neural network," laying the foundations for generating images using thermodynamic computing. The world's first 'thermodynamic computing chip' reached tape-out last year.

Thermodynamic computing is much more akin to quantum or probabilistic computing than to your traditional gaming PC, using noise and physical energy to solve problems. According to the report, the thermodynamic computer is given a set of images and then left to let them degrade: the natural random interactions run until equilibrium is achieved between the computer's components. The computer is then tasked with working out the probability of reversing this decay process, before adjusting its values to make that as likely as possible.

Whitelam followed this research up with an article in Physical Review Letters on January 20, detailing how this process can train a thermodynamic computer to generate images of handwritten digits. Naturally, that's a long way off the intense image generation capabilities of Google Gemini's Nano Banana Pro, or any other AI image generator you can think of.
However, it serves as a proof of concept that somehow, one day, thermodynamic computing could be used for AI image generation. "This research suggests that it's possible to make hardware to do certain types of machine learning," Whitelam told IEEE. Specifically, "image generation -- with considerably lower energy cost than we do at present." Given how rudimentary this proof of concept is, Whitelam warns that thermodynamic image generation to rival mainstream options is a long way off. "We don't yet know how to design a thermodynamic computer that would be as good at image generation as, say, DALL-E," he reportedly said. "It will still be necessary to work out how to build the hardware to do this." That's quite the catch, but in a world where AI buildouts and data center growth are putting unprecedented strain on global energy supply, a future process that could reduce AI image generation energy usage by a factor of ten billion would certainly be a breakthrough.
[3]
Could AI image generation consume far less energy? Research is ongoing
Scaling to complex image generation will require entirely new hardware designs and approaches

Scientists are exploring a new type of computing that uses natural energy flows to potentially perform AI tasks more efficiently. Unlike traditional digital computers, which rely on fixed circuits and exact calculations, thermodynamic computing works with randomness, noise, and physical interactions to solve problems. The idea is that this method could allow AI tools, including image editors, to run using far less power than current systems.

The process of thermodynamic image generation is unusual compared with normal computing. It begins with the computer receiving a set of images, which it then allows to "degrade." In this context, degrade does not mean the images are deleted or damaged; it means the data in the images is allowed to spread or change naturally due to tiny fluctuations in the system. These fluctuations are caused by the physical energy moving through the computer's components, like tiny currents and vibrations. Over time, these interactions cause the images to become blurred or noisy, creating a kind of natural disorder. The system then measures the likelihood of reversing this disorder, adjusting its internal settings to make reconstruction more likely. By running this process many times, the computer gradually learns to restore the original images without following the step-by-step logic used by conventional computers.

Stephen Whitelam, a researcher at Lawrence Berkeley National Laboratory, has demonstrated that thermodynamic computing can produce simple images such as handwritten digits. These outputs are far simpler than those from AI image generators like DALL-E or Google Gemini's Nano Banana Pro. Still, the research shows that physical systems can perform basic machine learning tasks, pointing to a new way AI could work. However, scaling this approach to produce high-quality, fully featured images will require new types of hardware.
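The "degrade until equilibrium" step described above has a standard physics caricature: an overdamped Langevin (Ornstein-Uhlenbeck) process. The sketch below is illustrative only; the spring constant k, temperature T, and a single cloud of points standing in for the image data are all assumptions, not anything from the studies themselves.

```python
import numpy as np

rng = np.random.default_rng(1)

# A cloud of points starting at x = 2.0 (the "stored image") drifts and
# jitters under thermal noise until it forgets where it started.
k, T, dt = 1.0, 0.5, 0.01      # spring constant, temperature, time step
x = np.full(5000, 2.0)
for _ in range(2000):          # 2000 steps of dt=0.01 is ~20 relaxation times
    x += -k * x * dt + np.sqrt(2 * T * dt) * rng.normal(size=x.size)

# At equilibrium the mean has decayed to ~0 and the variance approaches T/k,
# i.e. the original information has dissolved into pure thermal disorder.
mean, var = float(x.mean()), float(x.var())
```

Training then amounts to tuning the system's parameters so that running comparable noisy dynamics from disorder tends to land back on data-like states, rather than simulating each denoising step digitally.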
Proponents claim that thermodynamic computing could reduce the energy needed for AI image generation by a factor of ten billion compared with standard computers. If successful, this would greatly reduce the energy consumption of data centers running AI models. Although the first thermodynamic computing chip has been made, current prototypes are basic and cannot match mainstream AI tools. Researchers stress that the concept is limited to basic principles, and practical implementations will require breakthroughs in both hardware and computational design. "This research suggests that it's possible to make hardware to do certain types of machine learning...with considerably lower energy cost than we do at present," Whitelam told IEEE. "We don't yet know how to design a thermodynamic computer that would be as good at image generation as, say, DALL-E...it will still be necessary to work out how to build the hardware to do this."
Scientists at Lawrence Berkeley National Laboratory have shown in simulations that thermodynamic computing could generate AI images using one ten-billionth the energy of current tools like DALL-E and Midjourney. The research shows promise for addressing the high energy consumption of generative AI, though significant hardware development challenges remain before the technology can rival existing models.
Generative AI image tools like DALL-E, Midjourney, and Stable Diffusion have transformed how we create visual content, but they demand enormous amounts of power. Now, groundbreaking research suggests thermodynamic computing could generate images using one ten-billionth the energy of current digital systems [1]. Stephen Whitelam, a staff scientist at Lawrence Berkeley National Laboratory, and his colleague Corneel Casert published findings in Nature Communications on January 10 demonstrating it was possible to create a thermodynamic version of neural networks [2]. This lays the foundation for energy-efficient AI images that could dramatically reduce the energy consumption tied to machine learning tasks.
Unlike traditional digital computers that rely on fixed circuits and precise calculations, thermodynamic computing employs physical circuits that respond to noise from thermal fluctuations in the environment [1]. A prototype chip from New York-based startup Normal Computing illustrates this approach: eight resonators connected through special couplers form a customizable calculator. Programmers pluck the resonators to introduce noise into the network, and as the system reaches equilibrium, the solution emerges in the new configuration [1]. This method harnesses nature's randomness rather than fighting it, enabling massive energy savings compared to energy-intensive digital neural networks.

Whitelam's approach to AI image generation involves giving a thermodynamic computer a set of images, then allowing them to degrade naturally as random interactions between components run until equilibrium is achieved [3]. The system then calculates the probability of reversing this decay process and adjusts coupling values to maximize that likelihood. In simulations published January 20 in Physical Review Letters, this training process successfully generated images of handwritten digits without requiring energy-intensive digital neural networks or pseudorandom number generators [1]. The process resembles how diffusion models work, gradually adding noise until images resemble static on an analog television and then reversing the process, but it uses physical energy flows instead of digital computations [1].
While the energy cost advantages appear substantial, Whitelam cautions that current prototypes remain rudimentary compared to existing systems. "We don't yet know how to design a thermodynamic computer that would be as good at image generation as, say, DALL-E," he told IEEE. "It will still be necessary to work out how to build the hardware to do this" [1]. The world's first thermodynamic computing chip reached tape-out last year, but scaling from simple handwritten digits to the complex outputs of tools like Google Gemini's image generators requires entirely new hardware designs [2]. Whitelam acknowledges that near-term designs will likely fall somewhere between the theoretical ideal and current digital power levels [1].

As AI buildouts and data center growth place unprecedented strain on global energy supply, the potential to reduce the energy consumption of machine learning by such dramatic factors becomes critical [2]. "This research suggests that it's possible to make hardware to do certain types of machine learning -- here, image generation -- with considerably lower energy cost than we do at present," Whitelam explains [3]. If successful at scale, energy-efficient neural networks based on thermodynamic principles could transform how we approach AI infrastructure, making advanced capabilities accessible without the current environmental and economic costs. The research shows that physical systems can perform basic machine learning tasks in fundamentally new ways, opening pathways for future innovation even as significant technical hurdles remain.