2 Sources
[1]
China unveils 'world's first' brain-like AI, 100x faster on local tech
The system, which used hundreds of MetaX chips, ran stably for weeks at a time.

Researchers at the Chinese Academy of Sciences' Institute of Automation in Beijing have introduced a new artificial intelligence system called SpikingBrain 1.0. Described by the team as a "brain-like" large language model, it is designed to use less energy and operate on homegrown Chinese hardware rather than chips from industry leader Nvidia. "Mainstream Transformer-based large language models (LLMs) face significant efficiency bottlenecks: training computation scales quadratically with sequence length, and inference memory grows linearly," said the researchers in a non-peer-reviewed technical paper.
[2]
China develops SpikingBrain1.0, a brain-inspired AI model
Chinese researchers from the Chinese Academy of Sciences have unveiled SpikingBrain1.0, described as the world's first "brain-like" large language model (LLM). The model is designed to consume less power and operate independently of Nvidia GPUs, addressing limitations of conventional AI technologies.

Existing models, including ChatGPT and Meta's Llama, rely on "attention," a process that compares every word in a sentence to all others to predict the next word. While effective, this approach consumes large amounts of energy and slows processing for long texts, such as books. Traditional models also depend heavily on Nvidia GPUs, creating hardware bottlenecks for scaling.

SpikingBrain1.0 uses a localized attention mechanism, focusing on nearby words rather than analyzing entire texts. This mimics the human brain's ability to concentrate on recent context during conversations. Researchers claim this method allows the model to function 25 to 100 times faster than conventional LLMs while maintaining comparable accuracy.

The model runs on China's homegrown MetaX chip platform, eliminating reliance on Nvidia GPUs. It selectively responds to input, reducing power consumption and enabling continual pre-training with less than 2% of the data needed by mainstream open-source models. The researchers note that in specific scenarios, SpikingBrain1.0 can achieve over 100 times the speed of traditional AI models.

The development of SpikingBrain1.0 comes amid U.S. technology export restrictions that limit China's access to advanced chips required for AI and server applications. These restrictions have accelerated domestic AI innovation, with SpikingBrain1.0 representing a step toward a more self-sufficient AI ecosystem.
Chinese researchers have developed SpikingBrain 1.0, a new AI system that its creators claim runs up to 100 times faster than traditional models in some scenarios while using less energy. This breakthrough could potentially reshape the AI landscape and reduce China's dependence on foreign technology.
Researchers at the Chinese Academy of Sciences' Institute of Automation in Beijing have unveiled a groundbreaking artificial intelligence system called SpikingBrain 1.0, touted as the world's first 'brain-like' large language model (LLM) [1][2]. This innovative AI model promises to address key limitations of conventional AI technologies, potentially reshaping the landscape of artificial intelligence.

SpikingBrain 1.0 distinguishes itself from mainstream Transformer-based LLMs by tackling significant efficiency bottlenecks. In traditional models, training computation scales quadratically with sequence length and inference memory grows linearly [1].
To overcome these limitations, SpikingBrain 1.0 employs a localized attention mechanism, focusing on nearby words rather than analyzing entire texts. This approach mimics the human brain's ability to concentrate on recent context during conversations [2].

The researchers claim that this method allows the model to run 25 to 100 times faster than conventional LLMs while maintaining comparable accuracy; in specific scenarios, SpikingBrain 1.0 can achieve over 100 times the speed of traditional AI models [2].
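The paper's exact architecture is not reproduced here, but the core idea of localized (sliding-window) attention can be sketched in a few lines: each token attends only to a fixed-size window of recent tokens, so the cost per token is constant rather than growing with the full sequence. The function names and window size below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def local_attention(q, k, v, window=4):
    """Sliding-window attention sketch: position i attends only to the
    `window` most recent positions (itself included), so per-token work
    is O(window) instead of O(sequence length)."""
    n, d = q.shape
    out = np.empty_like(v)
    for i in range(n):
        lo = max(0, i - window + 1)            # earliest position in the window
        scores = q[i] @ k[lo:i + 1].T / np.sqrt(d)
        out[i] = softmax(scores) @ v[lo:i + 1]  # weighted sum over the window only
    return out

rng = np.random.default_rng(0)
q = rng.standard_normal((10, 8))
out = local_attention(q, q, q, window=4)
print(out.shape)  # (10, 8): one output vector per input position
```

With `window` equal to the sequence length, this reduces to ordinary causal attention; shrinking the window is what trades global context for the linear-time behavior described above.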
Another key advantage of SpikingBrain 1.0 is its reduced energy consumption. The model responds selectively to input, which not only lowers power usage but also enables continual pre-training with less than 2% of the data needed by mainstream open-source models [2].
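The article does not detail SpikingBrain's spiking scheme, but "selectively responds to input" is the hallmark of spiking (event-driven) neurons: a unit accumulates input and produces output only when a threshold is crossed, staying silent otherwise. A classic leaky integrate-and-fire neuron, shown below as an illustrative sketch with assumed parameter values, captures the idea.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire sketch: the membrane potential decays by
    `leak` each step and accumulates input; the neuron emits a spike (1)
    only when the potential crosses `threshold`, then resets.
    Downstream work is triggered only by spikes, which is where the
    energy savings of event-driven computation come from."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = leak * potential + x
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0   # reset after firing
        else:
            spikes.append(0)  # sub-threshold input produces no output
    return spikes

print(lif_neuron([0.3, 0.3, 0.6, 0.1, 0.0, 0.9]))  # → [0, 0, 1, 0, 0, 0]
```

Most time steps produce no spike, so a hardware implementation can skip the corresponding downstream computation entirely, in contrast to dense models that compute every activation at every step.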
Moreover, SpikingBrain 1.0 runs on China's homegrown MetaX chip platform, eliminating reliance on Nvidia GPUs. The system, which used hundreds of MetaX chips, demonstrated stable operation for weeks at a time [1]. This hardware independence is particularly significant given the current geopolitical context.
The development of SpikingBrain 1.0 comes at a crucial time for China's AI industry. Recent U.S. technology export restrictions have limited China's access to advanced chips required for AI and server applications. These restrictions have accelerated domestic AI innovation, with SpikingBrain 1.0 representing a significant step toward a more self-sufficient AI ecosystem [2].

By creating an AI model that operates efficiently on domestically produced hardware, China is positioning itself to reduce its dependence on foreign technology while pushing the boundaries of AI capabilities. This development could have far-reaching implications for the global AI landscape and the balance of technological power between nations.
Summarized by Navi