Curated by THEOUTPOST
On Sat, 29 Mar, 12:03 AM UTC
3 Sources
[1]
Advancing semiconductor devices for AI: Single transistor acts like neuron and synapse
Researchers from the National University of Singapore (NUS) have demonstrated that a single, standard silicon transistor, the fundamental building block of microchips used in computers, smartphones and almost every electronic system, can function like a biological neuron and synapse when operated in a specific, unconventional way. Led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering at the College of Design and Engineering, NUS, the research team's work presents a highly scalable and energy-efficient solution for hardware-based artificial neural networks (ANNs).

The world's most sophisticated computers already exist inside our heads. Studies show that the human brain is, by and large, more energy-efficient than electronic processors, thanks to almost 90 billion neurons that form some 100 trillion connections with each other, and synapses that tune their strength over time -- a process known as synaptic plasticity, which underpins learning and memory.

For decades, scientists have sought to replicate this efficiency using artificial neural networks (ANNs). ANNs, loosely inspired by how the brain processes information, have recently driven remarkable advances in artificial intelligence (AI). But while they borrow biological terminology, the similarities run only skin deep -- software-based ANNs, such as those powering large language models like ChatGPT, have a voracious appetite for computational resources, and hence, electricity. This makes them impractical for many applications.

Neuromorphic computing aims to mimic the computing power and energy efficiency of the brain. This requires not only redesigning system architecture to carry out memory and computation in the same place -- the so-called in-memory computing (IMC) -- but also developing electronic devices that exploit physical and electronic phenomena capable of replicating more faithfully how neurons and synapses work.
However, current neuromorphic computing systems are stymied by the need for complicated multi-transistor circuits or emerging materials that have yet to be validated for large-scale manufacturing. "To enable true neuromorphic computing, where microchips behave like biological neurons and synapses, we need hardware that is both scalable and energy-efficient," said Professor Lanza.

The NUS research team has now demonstrated that a single, standard silicon transistor, when arranged and operated in a specific way, can replicate both neural firing and synaptic weight changes -- the fundamental mechanisms of biological neurons and synapses. This was achieved by adjusting the resistance of the bulk terminal to specific values, which allows controlling two physical phenomena taking place in the transistor: punch-through impact ionization and charge trapping. Moreover, the team built a two-transistor cell capable of operating in either a neuron or a synaptic regime, which the researchers have called "Neuro-Synaptic Random Access Memory," or NS-RAM.

"Other approaches require complex transistor arrays or novel materials with uncertain manufacturability, but our method makes use of commercial CMOS (complementary metal-oxide-semiconductor) technology, the same platform found in modern computer processors and memory microchips," explained Professor Lanza. "This means it's scalable, reliable and compatible with existing semiconductor fabrication processes."

Through experiments, the NS-RAM cell demonstrated low power consumption, maintained stable performance over many cycles of operation and exhibited consistent, predictable behavior across different devices -- all desirable attributes for building reliable ANN hardware suited to real-world applications. The team's breakthrough marks a step change in the development of compact, power-efficient AI processors that could enable faster, more responsive computing.
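The two operating regimes can be pictured with a toy behavioural model. The sketch below is a conceptual analogue only, not the device physics: a leaky integrate-and-fire loop stands in for the impact-ionization current spike of the neuron regime, and a persistent, bounded weight stands in for the charge-trapping resistance shift of the synaptic regime. All function names, thresholds and step sizes are illustrative assumptions, not values from the paper.

```python
# Toy behavioural sketch of the two regimes -- a conceptual analogue,
# NOT the device physics. All constants are illustrative assumptions.

def neuron_mode(inputs, threshold=1.0, leak=0.9):
    """Neuron regime: integrate the input and emit a spike once a
    threshold is crossed (standing in for the punch-through
    impact-ionization current spike)."""
    v, spikes = 0.0, []
    for x in inputs:
        v = v * leak + x          # leaky integration of the input
        if v >= threshold:
            spikes.append(1)      # fire
            v = 0.0               # reset after the spike
        else:
            spikes.append(0)
    return spikes

def synapse_mode(weight, pulses, step=0.1):
    """Synapse regime: each programming pulse (+1 potentiate,
    -1 depress) shifts a bounded weight that then persists,
    standing in for charge trapped in the gate oxide."""
    for p in pulses:
        weight = round(min(1.0, max(0.0, weight + step * p)), 3)
    return weight

print(neuron_mode([0.5, 0.5, 0.5, 0.2]))   # [0, 0, 1, 0]
print(synapse_mode(0.5, [+1, +1, -1]))     # 0.6 -- the new value persists
```

The point of the sketch is only that one object supports both behaviours depending on how it is driven, which is what selecting the bulk-terminal resistance achieves in the real device.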
[2]
Redefining the transistor: The ideal building block for artificial intelligence | Newswise
Associate Professor Mario Lanza and his team have demonstrated a groundbreaking silicon transistor that mimics neural and synaptic behaviours, marking a significant breakthrough in neuromorphic computing.

The team, led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering in the College of Design and Engineering at the National University of Singapore, has revolutionised the field of neuromorphic computing by inventing a super-efficient computing cell that can mimic the behaviour of both electronic neurons and synapses. The work, titled "Synaptic and neural behaviours in a standard silicon transistor", was published in the scientific journal Nature on 26 March 2025 and is already attracting interest from leading companies in the semiconductor field.

Electronic neurons and synapses are the two fundamental building blocks of next-generation artificial neural networks. Unlike traditional computers, these systems process and store data in the same place, eliminating the need to waste time and energy transferring data from memory to the processing unit (CPU). The problem is that implementing electronic neurons and synapses with traditional silicon transistors requires interconnecting multiple devices -- specifically, at least 18 transistors per neuron and 6 per synapse. This makes them significantly larger and more expensive than a single transistor.

The team has found an ingenious way to reproduce the electronic behaviours characteristic of neurons and synapses in a single conventional silicon transistor. The key lies in setting the resistance of the bulk terminal to a specific value to produce a physical phenomenon called "impact ionisation," which generates a current spike very similar to what happens when an electronic neuron is activated.
Additionally, by setting the bulk resistance to other specific values, the transistor can store charge in the gate oxide, causing the resistance of the transistor to persist over time, mimicking the behaviour of an electronic synapse. Making the transistor operate as a neuron or a synapse is as simple as selecting the appropriate resistance for the bulk terminal.

The physical phenomenon of "impact ionisation" had traditionally been considered a failure mechanism in silicon transistors, but Professor Lanza's team has managed to control it and turn it into a highly valuable application for the industry. This discovery is revolutionary because it allows the size of electronic neurons to be reduced by a factor of 18 and that of synapses by a factor of 6. Considering that each artificial neural network contains millions of electronic neurons and synapses, this could represent a huge leap forward in computing systems capable of processing much more information while consuming far less energy.

Furthermore, the team has designed a cell with two transistors -- called Neuro-Synaptic Random Access Memory (NS-RAM) -- that allows switching between operating modes (neuron or synapse), offering great versatility in manufacturing, since both functions can be reproduced using a single block without the need to dope the silicon to achieve specific substrate resistance values.

The transistors used by Professor Lanza's team to implement these advanced neurons and synapses are not cutting-edge transistors like those manufactured in Taiwan or Korea, but rather traditional 180-nanometer node transistors, which can be produced by Singapore-based companies. According to Professor Lanza, "once the operating mechanism is discovered, it's now more a matter of microelectronic design".
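The quoted savings are easy to sanity-check. The back-of-envelope script below uses the article's figures (roughly 18 transistors per conventional electronic neuron and 6 per synapse, versus one each for the new cell); the network size is an illustrative assumption, since real ANN hardware varies widely.

```python
# Figures from the article: a conventional CMOS implementation needs about
# 18 transistors per electronic neuron and 6 per synapse; the NUS approach
# needs a single transistor for either. Network size below is illustrative.
neurons, synapses = 1_000_000, 10_000_000

conventional = 18 * neurons + 6 * synapses       # multi-transistor circuits
single_transistor = 1 * neurons + 1 * synapses   # one device per function

print(f"conventional:      {conventional:,} transistors")
print(f"single-transistor: {single_transistor:,} transistors")
print(f"reduction:         {conventional / single_transistor:.1f}x")
```

For this example network the device count drops from 78 million to 11 million transistors, roughly a sevenfold reduction, which is where the claimed gains in area and energy come from.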
The first author of the paper, Dr Sebastián Pazos, who is from King Abdullah University of Science and Technology, commented, "Traditionally, the race for supremacy in semiconductors and artificial intelligence has been a matter of brute force, seeing who could manufacture smaller transistors and bear the production costs that come with it. Our work proposes a radically different approach based on exploiting a computing paradigm using highly efficient electronic neurons and synapses. This discovery is a way to democratise nanoelectronics and enable everyone to contribute to the development of advanced computing systems, even without access to cutting-edge transistor fabrication processes."
[3]
Scientists trying to merge human neurons with semiconductors, human brain to power the future
TL;DR: Scientists at the National University of Singapore have developed a silicon transistor that mimics biological neurons and synapses, offering a scalable and energy-efficient solution for artificial neural networks. This advancement in neuromorphic computing uses commercial CMOS technology, potentially revolutionizing chip efficiency by emulating the human brain's processing capabilities.

Scientists are working on a new level of technology that mimics the neurons inside the human brain directly in semiconductors (chips), creating something that's no longer constrained by the usual limitations of a microchip. Researchers from the National University of Singapore (NUS) have shown off a single, standard silicon transistor that can function like a biological neuron and synapse when operated in a specific, unconventional way. The research team has presented its work as a highly scalable and energy-efficient solution for hardware-based artificial neural networks (ANNs).

The human brain is an amazing piece of engineering as it is, with studies showing that it is far more energy-efficient than electronic processors, with almost 90 billion neurons that form around 100 trillion connections with each other, and synapses that tune their strength as time goes by -- something called synaptic plasticity, which underpins learning and memory.

Scientists have tried to replicate the efficiency of the human brain using artificial neural networks (ANNs) for decades, and ANNs have recently driven huge advances in AI, inspired by how the brain processes information. This is where neuromorphic computing comes into play, bringing a future of chips that process information more efficiently, kinda like the human brain, closer to reality.

The research, led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering at the College of Design and Engineering, NUS, was published in the journal Nature.
Professor Lanza explains: "To enable true neuromorphic computing, where microchips behave like biological neurons and synapses, we need hardware that is both scalable and energy-efficient. Other approaches require complex transistor arrays or novel materials with uncertain manufacturability, but our method makes use of commercial CMOS (complementary metal-oxide-semiconductor) technology, the same platform found in modern computer processors and memory microchips. This means it's scalable, reliable and compatible with existing semiconductor fabrication processes".
Researchers at the National University of Singapore have developed a revolutionary silicon transistor that can function like both a neuron and a synapse, potentially transforming the field of neuromorphic computing and AI hardware efficiency.
Researchers from the National University of Singapore (NUS) have achieved a significant breakthrough in the field of neuromorphic computing. Led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering, the team has demonstrated that a single, standard silicon transistor can function like both a biological neuron and synapse when operated in a specific, unconventional manner [1].
The research team has developed a two-transistor cell capable of operating in either a neuron or synaptic regime, which they have named "Neuro-Synaptic Random Access Memory" or NS-RAM [1]. This innovation allows for the replication of both neural firing and synaptic weight changes -- the fundamental mechanisms of biological neurons and synapses -- in a single device [2].
The key to this breakthrough lies in adjusting the resistance of the bulk terminal to specific values, which enables the control of two physical phenomena in the transistor: punch-through impact ionization and charge trapping [1]. This approach allows for a significant reduction in the size of electronic neurons and synapses, by factors of 18 and 6 respectively [2].
Unlike other approaches that require complex transistor arrays or novel materials with uncertain manufacturability, this method utilizes commercial CMOS (complementary metal-oxide-semiconductor) technology [3]. This makes it scalable, reliable, and compatible with existing semiconductor fabrication processes, potentially revolutionizing the development of AI hardware [1].
Through experiments, the NS-RAM cell has demonstrated low power consumption, maintained stable performance over many cycles of operation, and exhibited consistent, predictable behavior across different devices [1]. These attributes make it highly suitable for building reliable artificial neural network (ANN) hardware for real-world applications.
This breakthrough marks a significant step towards the development of compact, power-efficient AI processors that could enable faster, more responsive computing [1]. By mimicking the efficiency of the human brain more closely, this technology has the potential to address the high computational resource and electricity demands of current software-based ANNs [2].
The discovery is already attracting interest from leading companies in the semiconductor field [2]. As Professor Lanza notes, "once the operating mechanism is discovered, it's now more a matter of microelectronic design" [2]. This breakthrough could potentially democratize nanoelectronics and enable broader contributions to the development of advanced computing systems, even without access to cutting-edge transistor fabrication processes [2].