News Release

A single-core-neuron framework unlocks efficient deep learning

Peer-Reviewed Publication

Science China Press

Overview of the OCNS as a “small model” for time-series prediction.

(a) The OCNS framework resembles that of an autoencoder. For an observed high-dimensional vector Xt, a latent delay vector Zt, composed of the dynamics of a one-dimensional delay dynamical system zt, is constructed via the input weight A and the OCN Φ through a delay-embedding scheme. The delay vector Zt at time t contains the latent temporal information of the delay dynamical system zt, which can topologically reconstruct all the dynamics of the original system Xt. With the output weight B, the original spatial information Xt can then be recovered from Zt. The OCN Φ, which generates the delay dynamical system zt with D delayed feedbacks in a single-neuron fashion, is the core of the OCNS.

(b) Grounded in the theory of delay dynamical systems and the delay embedding theorem, the information flow of the OCNS is dictated by the OCNS-based STI equations, which encompass both the primary and conjugate STI equations. Here, the researchers build the delay vector Zt+i = [zt+i−S+1, zt+i−S+2, …, zt+i]′ ∈ ℝ^S at time t + i, where i = 0, 1, 2, … and S is the delay-embedding dimension. Specifically, the input weight A and the OCN Φ transform the spatial information in the original attractor A into the temporal information of the delayed attractor N, corresponding to the primary STI equation, while the conjugate STI equation represents the reconstruction and prediction of the original system, constrained on attractor A, from the delayed attractor N through the output weight B. In this way, the OCNS effectively consists of an RNN with one neuron and two linear layers.
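The architecture in the caption can be sketched as a tiny recurrent loop: a single neuron driven by the projected input A·Xt and D of its own delayed outputs, followed by a linear read-out B from the delay vector Zt. Everything below (weight shapes, the tanh activation, the random initialization, and the values of n, D, and S) is an illustrative guess based only on the caption, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8   # dimension of the observed vector X_t (assumed value)
D = 5   # number of delayed feedback terms into the single neuron (assumed)
S = 5   # delay-embedding dimension of Z_t (assumed)

# Illustrative parameter shapes inferred from the caption
A = rng.normal(size=(n,)) / np.sqrt(n)    # input weight: X_t -> scalar drive
w = rng.normal(size=(D,)) / np.sqrt(D)    # feedback weights of the one-core neuron Φ
B = rng.normal(size=(n, S)) / np.sqrt(S)  # output weight: Z_t -> reconstructed X_t

def ocn_step(x_t, z_hist):
    """One-core neuron Φ: combines the scalar input drive A·x_t with
    D delayed feedbacks of its own past outputs z_{t-1}, ..., z_{t-D}."""
    return np.tanh(A @ x_t + w @ z_hist)

def run_ocns(X):
    """Encode observations X[t] into the latent scalar series z_t (primary STI
    direction), form delay vectors Z_t = [z_{t-S+1}, ..., z_t], and decode the
    spatial information with B (conjugate STI direction)."""
    T = len(X)
    z = np.zeros(T + D)                  # zero-pad the initial delays
    for t in range(T):
        z[t + D] = ocn_step(X[t], z[t:t + D])
    z = z[D:]
    # Delay vectors exist once S latent values are available
    Z = np.stack([z[t - S + 1:t + 1] for t in range(S - 1, T)])
    X_hat = Z @ B.T                      # recover the original spatial vectors
    return z, Z, X_hat

X = rng.normal(size=(50, n))
z, Z, X_hat = run_ocns(X)
print(z.shape, Z.shape, X_hat.shape)     # (50,) (46, 5) (46, 8)
```

Training A, w, and B (e.g., by gradient descent on a reconstruction-plus-prediction loss) is omitted; the sketch only shows why the parameter count stays tiny: n + D + n·S weights in total.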

Credit: ©Science China Press

The deep learning field has been dominated by “large models” requiring massive computational resources and energy, raising serious environmental and economic sustainability concerns. To address this, researchers have developed the one-core-neuron system (OCNS), a novel framework that minimizes model size while maintaining high performance.

This work was led by Prof. Rui Liu (School of Mathematics, South China University of Technology) and Prof. Luonan Chen (Center for Excellence in Molecular Cell Science, Chinese Academy of Sciences), with significant contributions from Dr. Hao Peng (School of Mathematics, South China University of Technology) and Prof. Pei Chen (School of Mathematics, South China University of Technology), who were responsible for designing the OCNS framework, collecting data, and conducting extensive experiments. Their collaborative efforts ensured rigorous evaluation and a comprehensive demonstration of the model’s capabilities.

Unlike traditional large models that rely on billions of parameters, the OCNS employs a single neuron to encode high-dimensional data into a one-dimensional time-series representation, with both theoretical grounding and computational validation. Based on the delay embedding theorem, the team designed this “small model” framework with spatiotemporal information (STI) transformation and multiple delayed feedbacks. This approach ensures precise forecasting while requiring, on average, only 0.035% of the parameters used in “large models”. Applications range from time-series prediction to image classification, making the OCNS an efficient and versatile deep-learning tool.
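The delay embedding theorem underpinning this design says that a single observed coordinate of a dynamical system, stacked with its own time-delayed copies, can topologically reconstruct the full state-space dynamics. A minimal illustration, using the classic Lorenz system as a stand-in (the system, the Euler integrator, and the embedding parameters S and τ are all assumptions for illustration, not taken from the paper):

```python
import numpy as np

def lorenz(T=5000, dt=0.01, sigma=10.0, rho=28.0, beta=8/3):
    """Simulate the Lorenz system with simple Euler steps (illustrative only)."""
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty((T, 3))
    for t in range(T):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        out[t] = (x, y, z)
    return out

traj = lorenz()
s = traj[:, 0]        # observe only a single one-dimensional coordinate

# Takens-style delay embedding: one scalar series -> S-dimensional delay vectors
S, tau = 3, 8         # embedding dimension and lag (assumed values)
emb = np.stack(
    [s[i * tau : len(s) - (S - 1 - i) * tau] for i in range(S)], axis=1
)
print(emb.shape)      # (4984, 3) -- each row is [s_t, s_{t+tau}, s_{t+2*tau}]
```

In the OCNS, the scalar series playing the role of s is not an observed coordinate but the output of the trained one-core neuron, so the delay vectors carry the temporal information needed to reconstruct and forecast the original high-dimensional system.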

The research team evaluated the OCNS on synthetic datasets and real-world time-series data, including weather and electricity-consumption records. Results showed that the OCNS consistently matched or outperformed existing benchmarks, even in the presence of noise.

This study opens new avenues for constructing efficient AI models with reduced energy footprints, aligning with the vision of sustainable and green AI development. As a compact model, OCNS demonstrates that efficiency and performance can coexist, providing a transformative perspective for future deep learning architectures.

See the article:

One-core neuron deep learning for time series prediction

https://doi.org/10.1093/nsr/nwae441


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.