3 Sources
[1]
Robotic eyes mimic human vision for superfast response to extreme lighting
In blinding bright light or pitch-black dark, our eyes can adjust to extreme lighting conditions within a few minutes. The human vision system, including the eyes, neurons, and brain, can also learn and memorize settings to adapt faster the next time we encounter similar lighting challenges.

In an article published in Applied Physics Letters, researchers at Fuzhou University in China created a machine vision sensor that uses quantum dots to adapt to extreme changes in light far faster than the human eye can -- in about 40 seconds -- by mimicking the eyes' key behaviors. Their results could be a game changer for robotic vision and autonomous vehicle safety.

"Quantum dots are nano-sized semiconductors that efficiently convert light to electrical signals," said author Yun Ye. "Our innovation lies in engineering quantum dots to intentionally trap charges like water in a sponge, then release them when needed -- similar to how eyes store light-sensitive pigments for dark conditions."

The sensor's fast adaptive speed stems from its unique design: lead sulfide quantum dots embedded in polymer and zinc oxide layers. The device responds dynamically by either trapping or releasing electric charges depending on the lighting, similar to how eyes store energy for adapting to darkness. The layered design, together with specialized electrodes, proved highly effective in replicating human vision and optimizing its light responses for the best performance.

"The combination of quantum dots, which are light-sensitive nanomaterials, and bio-inspired device structures allowed us to bridge neuroscience and engineering," Ye said.

Not only is the device effective at dynamically adapting to bright and dim lighting, but it also outperforms existing machine vision systems by reducing the large amount of redundant data those systems generate.

"Conventional systems process visual data indiscriminately, including irrelevant details, which wastes power and slows computation," Ye said. "Our sensor filters data at the source, similar to the way our eyes focus on key objects, and our device preprocesses light information to reduce the computational burden, just like the human retina."

In the future, the research group plans to enhance the device with larger sensor arrays and edge-AI chips, which perform AI data processing directly on the sensor, and to integrate it with other smart devices in cars for broader applicability in autonomous driving.

"Immediate uses for our device are in autonomous vehicles and robots operating in changing light conditions, like going from tunnels to sunlight, but it could potentially inspire future low-power vision systems," Ye said. "Its core value is enabling machines to see reliably where current vision sensors fail."
[2]
New robot eyes respond to blinding light 5 times faster than humans
"The combination of quantum dots, which are light-sensitive nanomaterials, and bio-inspired device structures allowed us to bridge neuroscience and engineering," said Yun Ye, the author.

In recent years, significant advances have been made in giving robots vision. For example, last year, University of Pennsylvania researchers developed PanoRadar, a system that uses radio signals and AI to convert basic radio waves into detailed 3D views. It could enable robots to "see" beyond typical sensor limitations and improve upon conventional low-resolution radar.

This new development, however, offers a unique approach to robotic vision. Our eyes are remarkable, quickly adapting to extreme light changes, from darkness to bright sunlight or vice versa. Not only that, they learn, adjusting even faster the next time. This ability of human vision, involving our eyes, neurons, and brain, allows us to see clearly no matter the conditions. Now, researchers aim to bring this ability to machines, particularly next-generation robots.
[3]
Robotic Eyes Mimic Human Vision for Superfast Response to Extreme Lighting | Newswise
Newswise -- WASHINGTON, July 1, 2025 -- In an article published this week in Applied Physics Letters, by AIP Publishing, researchers at Fuzhou University in China created a machine vision sensor that uses quantum dots to adapt to extreme changes in light far faster than the human eye can -- in about 40 seconds -- by mimicking the eyes' key behaviors.

###

The article "A back-to-back structured bionic visual sensor for adaptive perception" is authored by Xing Lin, Zexi Lin, Wenxiao Zhao, Sheng Xu, Enguo Chen, Tailiang Guo, and Yun Ye. It will appear in Applied Physics Letters on July 1, 2025 (DOI: 10.1063/5.0268992). After that date, it can be accessed at https://doi.org/10.1063/5.0268992.

ABOUT THE JOURNAL

Applied Physics Letters features rapid reports on significant discoveries in applied physics. The journal covers new experimental and theoretical research on applications of physics phenomena related to all branches of science, engineering, and modern technology. See https://pubs.aip.org/aip/apl.
Researchers at Fuzhou University have developed a machine vision sensor using quantum dots that can adapt to extreme lighting changes faster than the human eye, potentially revolutionizing robotic vision and autonomous vehicle safety.
Researchers at Fuzhou University in China have developed a groundbreaking machine vision sensor that mimics the human eye's ability to adapt to extreme lighting conditions. This innovative technology, which utilizes quantum dots, can adjust to drastic changes in light approximately five times faster than the human eye – in just about 40 seconds [1].
The sensor's remarkable adaptability stems from its unique design, which incorporates lead sulfide quantum dots embedded in polymer and zinc oxide layers. These quantum dots, which are nano-sized semiconductors, efficiently convert light into electrical signals [2].
"Our innovation lies in engineering quantum dots to intentionally trap charges like water in a sponge then release them when needed -- similar to how eyes store light-sensitive pigments for dark conditions," explained Yun Ye, one of the study's authors [1].
The device's structure is bio-inspired, replicating key behaviors of the human vision system. It responds dynamically by either trapping or releasing electric charges depending on the lighting conditions, much like how human eyes store energy for adapting to darkness [3].
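The trap-and-release behavior described above can be captured with a toy first-order model. This is an illustrative sketch only, not the device physics reported in the paper: the function name `simulate_adaptation` and all parameter values are hypothetical. The idea is that trapped charge builds up under sustained light and leaks away in darkness, scaling the sensor's effective response so that a sudden brightness step first produces a large signal and then adapts downward.

```python
def simulate_adaptation(light_levels, dt=1.0, tau=10.0, k=0.8):
    """Toy sponge-like trap model (illustrative only).

    q is the normalized trapped charge (0..1). It relaxes toward the
    current light level with time constant tau, so sustained bright light
    fills the traps and darkness empties them. The emitted response is
    the light level scaled down by how full the traps are, which makes
    the output sag (adapt) after a step into bright light.
    """
    q = 0.0
    responses = []
    for L in light_levels:
        q += (L - q) * (dt / tau)       # traps fill/empty toward L
        responses.append(L * (1.0 - k * q))  # full traps damp the response
    return responses

# Step from dim (0.05) to bright (1.0): response spikes, then adapts down.
levels = [0.05] * 20 + [1.0] * 60
r = simulate_adaptation(levels)
```

Plotting `r` would show the characteristic overshoot-then-settle curve of light adaptation; the time constant `tau` plays the role the quantum-dot trap dynamics play in the real sensor.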
"The combination of quantum dots, which are light-sensitive nanomaterials, and bio-inspired device structures allowed us to bridge neuroscience and engineering," Ye stated [2].
This new sensor not only adapts quickly to extreme lighting conditions but also outperforms existing machine vision systems in data processing. Unlike conventional systems that indiscriminately process visual data, including irrelevant details, this sensor filters data at the source [1].
"Our sensor filters data at the source, similar to the way our eyes focus on key objects, and our device preprocesses light information to reduce the computational burden, just like the human retina," Ye explained [3].
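Retina-style filtering at the source can be illustrated with a minimal sketch. This is hypothetical code, not the team's implementation: only pixels whose intensity changed beyond a threshold between frames are passed downstream, so unchanged (redundant) pixels are discarded before any heavier computation ever sees them.

```python
def filter_changes(prev_frame, curr_frame, threshold=0.1):
    """Emit only (index, value) pairs whose intensity changed beyond the
    threshold -- discarding redundant, unchanged pixels at the source."""
    return [
        (i, curr)
        for i, (prev, curr) in enumerate(zip(prev_frame, curr_frame))
        if abs(curr - prev) > threshold
    ]

prev = [0.2, 0.2, 0.9, 0.5]
curr = [0.2, 0.8, 0.9, 0.1]   # only pixels 1 and 3 changed meaningfully
events = filter_changes(prev, curr)   # -> [(1, 0.8), (3, 0.1)]
```

Event cameras and neuromorphic sensors use the same principle in hardware: downstream processing receives a sparse stream of changes rather than full frames, which is where the power and bandwidth savings come from.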
The immediate applications for this technology are in autonomous vehicles and robots operating in changing light conditions, such as transitioning from tunnels to sunlight. However, its potential extends beyond these use cases, possibly inspiring future low-power vision systems [1].
The research team plans to further enhance their device with larger sensor arrays and edge-AI chips, which perform AI data processing directly on the sensor. They are also exploring its use in smart devices for autonomous driving [3].
"Its core value is enabling machines to see reliably where current vision sensors fail," Ye concluded, highlighting the technology's potential to revolutionize machine vision capabilities [1].
Summarized by Navi
[2]
Databricks raises $1 billion in a new funding round, valuing the company at over $100 billion. The data analytics firm plans to invest in AI database technology and an AI agent platform, positioning itself for growth in the evolving AI market.
11 Sources
Business
14 hrs ago
SoftBank makes a significant $2 billion investment in Intel, boosting the chipmaker's efforts to regain its competitive edge in the AI semiconductor market.
22 Sources
Business
22 hrs ago
OpenAI introduces ChatGPT Go, a new subscription plan priced at ₹399 ($4.60) per month exclusively for Indian users, offering enhanced features and affordability to capture a larger market share.
15 Sources
Technology
22 hrs ago
Microsoft introduces a new AI-powered 'COPILOT' function in Excel, allowing users to perform complex data analysis and content generation using natural language prompts within spreadsheet cells.
8 Sources
Technology
14 hrs ago
Adobe launches Acrobat Studio, integrating AI assistants and PDF Spaces to transform document management and collaboration, marking a significant evolution in PDF technology.
10 Sources
Technology
14 hrs ago