The deep learning field has been dominated by “large models” that demand massive computational resources and energy, raising serious environmental and economic sustainability concerns. To address this, researchers have developed the one-core-neuron system (OCNS), a novel framework that minimizes model size while maintaining high performance.
This work was led by Prof. Rui Liu (School of Mathematics, South China University of Technology) and Prof. Luonan Chen (Center for Excellence in Molecular Cell Science, Chinese Academy of Sciences), with significant contributions from Dr. Hao Peng (School of Mathematics, South China University of Technology) and Prof. Pei Chen (School of Mathematics, South China University of Technology), who were responsible for designing the OCNS framework, collecting data, and conducting extensive experiments. Their collaborative efforts ensured rigorous evaluation and a comprehensive demonstration of the model’s capabilities.
Unlike traditional large models that rely on billions of parameters, the OCNS employs a single neuron to encode high-dimensional data into a one-dimensional time-series representation. Building on the delay embedding theorem, the team designed this “small model” framework around spatiotemporal information (STI) transformation and multiple delayed feedback loops, and supported it both theoretically and computationally. The approach delivers accurate forecasts while requiring, on average, only 0.035% of the parameters used in “large models”. Its applications range from time-series prediction to image classification, making OCNS an efficient and versatile deep-learning tool.
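The delay-embedding idea at the heart of this framework can be illustrated with a minimal sketch. This is not the authors' OCNS implementation: the toy sine series, the embedding dimension and delay, and the single linear "neuron" fit by least squares are all illustrative assumptions, meant only to show how delay coordinates let a very small model forecast a time series.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Stack delayed copies of a scalar series into embedding vectors
    [x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau}] (Takens-style coordinates)."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Toy observed variable: a noisy sine wave (a stand-in for real data).
rng = np.random.default_rng(0)
t = np.arange(500)
x = np.sin(0.1 * t) + 0.05 * rng.standard_normal(len(t))

dim, tau = 5, 2
E = delay_embed(x, dim, tau)              # embedded states, shape (492, 5)
X = E[:-1]                                # current delay vectors
y = x[(dim - 1) * tau + 1 :]              # next scalar value to predict

# A deliberately tiny model: one linear "neuron" (weights + bias)
# fit by ordinary least squares on the delay coordinates.
A = np.c_[X, np.ones(len(X))]
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))  # error is near the noise level
```

Even this six-parameter predictor tracks the series closely, because the delay embedding reconstructs the system's state from a single observed variable; the actual OCNS goes much further, with STI transformation and multiple delayed feedback.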
The research team evaluated OCNS on synthetic datasets and real-world time series, including weather and electricity-consumption data. Results showed that OCNS consistently matched or outperformed existing benchmarks, even in the presence of noise.
This study opens new avenues for constructing efficient AI models with reduced energy footprints, aligning with the vision of sustainable and green AI development. As a compact model, OCNS demonstrates that efficiency and performance can coexist, providing a transformative perspective for future deep learning architectures.
See the article:
One-core neuron deep learning for time series prediction
https://doi.org/10.1093/nsr/nwae441
Journal: National Science Review