Princeton's 3D Neural Network Device Merges Living Brain Cells With Electronics for Biocomputing


Princeton researchers have developed a 3D neural network device that integrates living brain cells with advanced electronics, creating a programmable biological neural network. The device uses a flexible microscopic metal-mesh scaffold to culture tens of thousands of neurons, enabling six months of monitoring and pattern recognition, with the biological tissue operating at roughly one-millionth the power that conventional AI systems use for similar tasks.

Princeton Creates Programmable 3D Neural Network Device

Researchers at Princeton University have developed a groundbreaking 3D neural network device that bridges the gap between biological systems and silicon-based computing. Led by Tian-Ming Fu, assistant professor of Electrical and Computer Engineering, along with James Sturm and Kumar Mritunjay, the team created a device that integrates neurons and electronics into a single programmable unit capable of pattern recognition and biocomputing tasks [2].

Unlike previous "brain-on-a-chip" attempts that relied on flat surfaces or external probing of 3D clusters, this 3D bio-hybrid device works from the inside out. The team fabricated a microscopic metal-mesh scaffold using advanced techniques, with electrodes supported by a thin epoxy coating that provides optimal flexibility to interface with soft neurons [2]. This 3D flexible electronic sensor and stimulator array allows tens of thousands of neurons to grow around and through the sensors, forming a vast biological neural network [1].

Source: Neuroscience News

Long-Term Monitoring Reveals Evolving Neuronal Connectivity

The device can record action potentials from multiple planes over a period of six months, enabling quantitative monitoring of evolving connectivity maps and responses to pharmacological stimulation [1]. This extended tracking capability represents a significant advance over previous approaches, allowing researchers to observe neural development and how connections strengthen or weaken over time. The integrated approach enables recording and stimulation of electrical activity at a much finer scale than past methods [2].

Training Neural Networks Through Chronic Electrical Stimulation

The device supports chronic electrical stimulation, which the researchers used to train the neural network by tuning the connection strengths between neurons [1]. This training yielded a reservoir neural network capable of performing computational tasks. In testing, the team presented pairs of distinct spatial patterns in one experiment and distinct temporal patterns in another; the system correctly distinguished the patterns in both tests [2]. The researchers developed an algorithm that recognizes patterns of electrical pulses by strategically strengthening and weakening the connections between key neurons.

Brain-Inspired Computing Tackles AI Energy Crisis

While initially developed to study fundamental neuroscience problems, the research addresses a critical bottleneck in artificial intelligence: energy consumption. "The real bottleneck for AI in the near future is energy," Fu explained, noting that the human brain consumes only one-millionth of the power used by today's AI systems to perform similar tasks [2]. This positions brain-inspired computing as a potential pathway toward energy-efficient AI systems that could dramatically reduce the computational costs of modern machine learning.

Implications for Neuroscience and Medical Research

Mritunjay, the paper's first author, emphasized that these 3D biological neural networks "not only help uncover the computing secrets of the brain but can also assist in understanding and possibly treating neurological diseases" [2]. The ability to create stable device-neural-network interfaces opens new possibilities for studying neural development and disease progression [1]. The team aims to scale the system to handle increasingly complex tasks, potentially advancing both our understanding of how biological brains compute and the development of practical alternatives to power-hungry conventional AI architectures. The study was published in Nature Electronics on April 23.
