2 Sources
[1]
Optical method runs AI tensor operations at the speed of light
Tensor operations drive nearly every AI task today. GPUs handle them well, but the surge in data has exposed limits in speed, power efficiency and scalability. This pressure pushed an international team led by Dr. Yufeng Zhang of Aalto University to look beyond electronic circuits. The group has developed "single-shot tensor computing," a technique that uses the physical properties of light to process data.

Light waves carry amplitude and phase. The team encoded digital information into these properties and allowed the waves to interact as they travel. That interaction performs the same mathematical operations as deep learning systems. "Our method performs the same kinds of operations that today's GPUs handle, like convolutions and attention layers, but does them all at the speed of light," says Dr. Zhang. He adds that the system avoids electronic switching because the optical operations unfold naturally during propagation.

The researchers pushed this method further by adding multiple wavelengths of light. Each wavelength behaves like its own computational channel, which lets the system process higher-order tensor operations in parallel.
[2]
AI at the speed of light just became a possibility
Researchers at Aalto University have demonstrated single-shot tensor computing at the speed of light, a remarkable step towards next-generation artificial general intelligence hardware powered by optical computation rather than electronics.

Tensor operations are the kind of arithmetic that forms the backbone of nearly all modern technologies, especially artificial intelligence, yet they extend beyond the simple math we're familiar with. Imagine the mathematics behind rotating, slicing, or rearranging a Rubik's cube along multiple dimensions. While humans and classical computers must perform these operations step by step, light can do them all at once.

Today, every task in AI, from image recognition to natural language processing, relies on tensor operations. However, the explosion of data has pushed conventional digital computing platforms, such as GPUs, to their limits in terms of speed, scalability and energy consumption.

How light enables instant tensor math

Motivated by this pressing problem, an international research collaboration led by Dr. Yufeng Zhang from the Photonics Group at Aalto University's Department of Electronics and Nanoengineering has unlocked a new approach that performs complex tensor computations using a single propagation of light. The result is single-shot tensor computing, achieved at the speed of light itself. The research was published in Nature Photonics.

"Our method performs the same kinds of operations that today's GPUs handle, like convolutions and attention layers, but does them all at the speed of light," says Dr. Zhang. "Instead of relying on electronic circuits, we use the physical properties of light to perform many computations simultaneously."

To achieve this, the researchers encoded digital data into the amplitude and phase of light waves, effectively turning numbers into physical properties of the optical field. When these light fields interact and combine, they naturally carry out mathematical operations such as matrix and tensor multiplications, which form the core of deep learning algorithms. By introducing multiple wavelengths of light, the team extended this approach to handle even higher-order tensor operations.

Potential impact and future applications

"Imagine you're a customs officer who must inspect every parcel through multiple machines with different functions and then sort them into the right bins," Zhang explains. "Normally, you'd process each parcel one by one. Our optical computing method merges all parcels and all machines together -- we create multiple 'optical hooks' that connect each input to its correct output. With just one operation, one pass of light, all inspections and sorting happen instantly and in parallel."

Another key advantage of this method is its simplicity. The optical operations occur passively as the light propagates, so no active control or electronic switching is needed during computation.

"This approach can be implemented on almost any optical platform," says Professor Zhipei Sun, leader of Aalto University's Photonics Group. "In the future, we plan to integrate this computational framework directly onto photonic chips, enabling light-based processors to perform complex AI tasks with extremely low power consumption."

Ultimately, the goal is to deploy the method on existing hardware or platforms established by major companies, says Zhang, who conservatively estimates the approach will be integrated into such platforms within three to five years. "This will create a new generation of optical computing systems, significantly accelerating complex AI tasks across a myriad of fields," he concludes.
Researchers at Aalto University have developed single-shot tensor computing using light waves, enabling AI operations to be performed at the speed of light. This optical method could revolutionize AI hardware by dramatically improving speed and energy efficiency.
Researchers at Aalto University have achieved a significant breakthrough in artificial intelligence computing by developing a method that performs tensor operations at the speed of light. The international research team, led by Dr. Yufeng Zhang from the Photonics Group at Aalto University's Department of Electronics and Nanoengineering, has demonstrated "single-shot tensor computing" that could fundamentally transform AI hardware.[1][2]

Tensor operations form the mathematical backbone of nearly all modern AI technologies, from image recognition to natural language processing. However, the exponential growth in data has pushed conventional computing platforms like GPUs to their limits in terms of speed, scalability, and energy consumption.[2]
Source: Tech Xplore
The innovative approach leverages the physical properties of light waves to perform complex mathematical computations. The research team encoded digital information into the amplitude and phase of light waves, effectively converting numbers into physical properties of the optical field. When these light fields interact and combine during propagation, they naturally execute mathematical operations such as matrix and tensor multiplications that are essential for deep learning algorithms.[1][2]

"Our method performs the same kinds of operations that today's GPUs handle, like convolutions and attention layers, but does them all at the speed of light," explains Dr. Zhang. The system avoids the need for electronic switching because the optical operations occur naturally during light propagation.[1]
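The amplitude-and-phase encoding described above can be mimicked numerically as a toy model. The sketch below (a conceptual illustration, not the published optical implementation; all function names are hypothetical) represents each number as a complex field value, so that a coherent pass through a transmission matrix amounts to a matrix-vector product performed by interference:

```python
import numpy as np

def encode_as_field(x):
    """Encode real numbers into amplitude and phase of a complex field.

    Amplitude carries the magnitude; a phase of pi encodes a minus sign.
    """
    return np.abs(x) * np.exp(1j * np.pi * (x < 0))

def optical_matvec(T, x):
    """Model one pass through a transmission matrix T.

    The coherent sum of all paths at each output port is, mathematically,
    a matrix-vector product -- computed in a single propagation.
    """
    field_in = encode_as_field(x)
    return T @ field_in  # interference of all paths at once

x = np.array([1.0, -2.0, 0.5])
T = np.array([[0.2, 0.5, 0.1],
              [0.3, -0.4, 0.7]])

# The real part of the output field recovers the ordinary matrix product.
assert np.allclose(optical_matvec(T, x).real, T @ x)
```

The point of the toy model is that no step-by-step arithmetic appears anywhere: the "computation" is a single linear propagation, which is what lets the physical system do it passively.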
To enhance computational capacity, the researchers incorporated multiple wavelengths of light, with each wavelength functioning as its own computational channel. This allows the system to process higher-order tensor operations in parallel, significantly expanding its processing capabilities.[1]
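Numerically, treating each wavelength as an independent channel turns a stack of per-wavelength matrix products into one higher-order tensor contraction. A minimal sketch of that idea (dimensions and names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

n_wavelengths, m, k, n = 4, 3, 5, 2
rng = np.random.default_rng(0)
T = rng.normal(size=(n_wavelengths, m, k))  # one transmission matrix per wavelength
X = rng.normal(size=(n_wavelengths, k, n))  # one input block per wavelength

# All wavelength channels at once: a single order-3 tensor contraction.
Y = np.einsum('wmk,wkn->wmn', T, X)

# Equivalent sequential view: one matrix multiply per wavelength.
Y_seq = np.stack([T[w] @ X[w] for w in range(n_wavelengths)])
assert np.allclose(Y, Y_seq)
```

The loop and the einsum give identical results; the optical claim is that the hardware evaluates the einsum-like form in one pass, with the wavelength axis realized physically rather than iterated over.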
Dr. Zhang illustrates the method's efficiency with an analogy: "Imagine you're a customs officer who must inspect every parcel through multiple machines with different functions and then sort them into the right bins. Normally, you'd process each parcel one by one. Our optical computing method merges all parcels and all machines together -- we create multiple 'optical hooks' that connect each input to its correct output. With just one operation, one pass of light, all inspections and sorting happen instantly and in parallel."[2]

A key advantage of this optical approach is its simplicity and passive operation. Unlike traditional electronic systems that require active control and switching, the optical computations occur naturally as light propagates through the system. This characteristic makes the method implementable on almost any optical platform.[2]
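The customs-officer analogy can be made concrete in a few lines: sequentially, every "parcel" (input) visits every "machine" (linear operation); in a single-shot scheme, one combined operation handles every pairing at once. This is a hypothetical numerical analogy, not the optical hardware itself:

```python
import numpy as np

rng = np.random.default_rng(42)
parcels = rng.normal(size=(8, 4))      # 8 inputs, 4 features each
machines = rng.normal(size=(3, 4, 4))  # 3 different linear "inspections"

# One-by-one: loop over every (machine, parcel) pair.
slow = np.empty((3, 8, 4))
for i, M in enumerate(machines):
    for j, p in enumerate(parcels):
        slow[i, j] = M @ p

# Single shot: one tensor contraction covers every pair simultaneously.
fast = np.einsum('mab,pb->mpa', machines, parcels)
assert np.allclose(slow, fast)
```

Both routes produce the same tensor of results; the difference is that the single contraction has no sequential inner steps, which is the property the optical "one pass of light" exploits.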
Professor Zhipei Sun, leader of Aalto University's Photonics Group, envisions integrating this computational framework directly onto photonic chips, which would enable light-based processors to perform complex AI tasks with extremely low power consumption.[2]

Dr. Zhang conservatively estimates that this optical computing approach will be integrated into platforms established by major technology companies within three to five years. The research, published in Nature Photonics, represents a crucial step toward next-generation artificial general intelligence hardware powered by optical computation rather than traditional electronics.[2]
Source: Interesting Engineering
Summarized by Navi