China reveals the first brain-like AI. It's 100 times faster than Western rivals

by Andrea
Unlike systems such as ChatGPT, which use the entire network to respond to prompts, the Chinese SpikingBrain 1.0 system works like a brain, activating only specific parts of the network according to the task at hand.

Researchers at the Institute of Automation of the Chinese Academy of Sciences in Beijing have developed a new artificial intelligence system, SpikingBrain 1.0, which they describe as a "brain-like" large-scale language model designed for efficiency and independence from external hardware.

Unlike traditional transformer systems such as ChatGPT, which require huge computational power and memory, SpikingBrain 1.0 employs a method called spiking computation. Inspired by the way biological neurons work, this approach activates only specific parts of the network in response to the input instead of keeping the entire system active. The result is significantly lower energy consumption and faster processing times.
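To give a rough sense of the idea, here is a minimal sketch in Python of event-driven, spike-based computation. It is purely illustrative and not the SpikingBrain 1.0 implementation: a simple leaky integrate-and-fire neuron only emits an output "spike" when its accumulated potential crosses a threshold, so downstream work is triggered by a sparse subset of inputs rather than at every step.

```python
# Illustrative sketch only (assumed leaky integrate-and-fire model, not the
# authors' actual code): output is produced only when a threshold is crossed.
import numpy as np

def lif_spikes(inputs, decay=0.9, threshold=1.0):
    """Return a binary spike train for a stream of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = decay * potential + current  # leaky integration of input
        if potential >= threshold:               # fire only when threshold is crossed
            spikes.append(1)
            potential = 0.0                      # reset after the spike
        else:
            spikes.append(0)                     # no spike -> no downstream computation
    return spikes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = rng.uniform(0.0, 0.5, size=20)      # mostly sub-threshold inputs
    out = lif_spikes(stream)
    print(out, f"active steps: {sum(out)}/{len(out)}")
```

Because most steps produce no spike, only a fraction of the network needs to do any work, which is the intuition behind the energy and speed savings described above.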

According to the research team, the system performed certain tasks up to 100 times faster than conventional models while requiring less than 2% of the usual training data. To demonstrate the technology, the team built two versions: one with 7 billion parameters and one with 76 billion. Both were trained on approximately 150 billion tokens, a modest data set by industry standards.

Efficiency gains were especially notable when dealing with extremely long data sequences. In one experiment, the smaller model processed a 4-million-token prompt more than 100 times faster than a standard transformer. Another test showed a 26.5-fold speedup in generating the first token from a one-million-token context.

The researchers also highlighted the model's stability. SpikingBrain 1.0 ran for weeks on hundreds of MetaX chips, produced domestically by MetaX Integrated Circuits Co., based in Shanghai, marking an important step toward reducing dependence on the American company Nvidia, the current global leader in AI chips.

Potential applications of the system include the analysis of lengthy legal and medical documents, research in high-energy physics, and DNA sequencing. All of these fields require handling vast data sets quickly and efficiently, the team explains.

"Mainstream transformer-based large language models face significant efficiency bottlenecks," the researchers noted in their paper, which has not yet been peer-reviewed. "Our work is inspired by brain mechanisms."

The researchers conclude that SpikingBrain 1.0 "not only demonstrates the feasibility of efficient training of large-scale models on non-NVIDIA platforms, but also outlines new directions for the scalable implementation and application of brain-inspired models in future computing systems."
