Chinese researchers have unveiled a groundbreaking AI model that mimics the behavior of neurons in the human brain, potentially revolutionizing the future of artificial intelligence. This new model, developed by a team from the Chinese Academy of Sciences’ Institute of Automation and Peking University, promises to deliver powerful computational abilities without the high energy consumption associated with traditional silicon-based processors.
The research team aimed to bridge the gap between the complex workings of large AI models and the intricate, yet energy-efficient, operations of the human brain. While AI systems have rapidly expanded in capability, their growing demand for energy has become a serious concern. In contrast, the human brain, which is far more complex than any existing AI model, operates on a small fraction of the energy such systems require.
Recognizing this disparity, the researchers set out to develop an AI model capable of performing more tasks with significantly reduced energy consumption. Their solution, referred to as the “internal complexity model,” seeks to replicate the brain’s inner workings to achieve cognitive tasks using minimal energy.
According to a report by Xinhua, tests conducted by the research team demonstrated the model’s effectiveness across various tasks. The experiments indicated that this internal complexity model could pave the way for new methods in integrating neuroscience’s dynamic characteristics into AI. Additionally, it offers practical solutions for optimizing AI performance, potentially making future AI models more efficient and versatile.
In their study, the researchers constructed a network of Hodgkin-Huxley (HH) neurons, a biophysically detailed model with rich internal complexity that captures the ion-channel dynamics of real neurons, to validate their approach. They showed that this network could match the performance of much larger AI models built from leaky integrate-and-fire (LIF) neurons, a far simpler model of neuronal behavior.
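To make the contrast concrete, the sketch below simulates a single leaky integrate-and-fire neuron, the simpler of the two models named in the study: the membrane voltage leaks toward a resting value, integrates input current, and fires a spike whenever it crosses a threshold. This is a generic textbook illustration, not the researchers' implementation, and all parameter values are illustrative.

```python
def simulate_lif(current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, resistance=1e7):
    """Euler-integrate the LIF equation tau * dV/dt = -(V - v_rest) + R * I.

    `current` is a list of input currents (amps), one per time step of
    length `dt` seconds. Returns the times (seconds) at which the neuron
    spiked. Parameter values are illustrative, not from the paper.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(current):
        # Leak toward rest plus input drive, scaled by the membrane time constant.
        v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
        if v >= v_thresh:            # threshold crossed: fire and reset
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant 2 nA input drives the neuron above threshold repeatedly
# over one simulated second of input.
spikes = simulate_lif([2e-9] * 10000)
print(len(spikes) > 0)
```

A Hodgkin-Huxley neuron replaces the single leak term with coupled differential equations for sodium and potassium channel gating, which is what the study means by "internal complexity": each unit is individually richer, so fewer units are needed.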
The findings challenge the prevailing trend in AI research, which has focused on building ever larger and more complex neural networks. The team suggests an alternative approach, in which a "small model with internal complexity" incorporates rich neuronal properties to build more efficient AI systems. This approach could eliminate the need to expand network scale externally, thereby reducing energy requirements.
This innovative research highlights a potential shift in AI development, moving away from energy-intensive models toward more sustainable and biologically inspired solutions. The implications of this work could lead to more efficient AI applications across various fields, ultimately reducing the environmental impact of AI technology.
By Impact Lab