The brain is an extremely complex organ whose exact functioning remains difficult to understand. The human brain contains roughly 100 billion neurons, which fire upon receiving input signals from the sensory organs. What is truly remarkable about our brain, however, is the synchronization of this neural firing when triggered by a common input. Put simply, a common input can generate a collective response in neurons that are not only spatially separated but also have different firing characteristics.
Neural synchronization has been observed in experiments before, both during rest and during task-related activity. However, the common inputs that produce it are typically unknown in real-world situations. This raises an interesting question: is it possible to reconstruct this input by looking at the output of the neurons?
In a new study published in Volume 106, Issue 3 of Physical Review E on 12 September 2022, a team of researchers from Japan, led by Professor Tohru Ikeguchi of Tokyo University of Science (TUS), set out to answer this question. The team, which also included Associate Professor Ryota Nomura of Waseda University (formerly of TUS) and Associate Professor Kantaro Fujiwara of The University of Tokyo, analyzed the firing rates of neurons and managed to reconstruct the input signal using a method called the “superposed recurrence plot” (SRP).
“We developed a method that uses a recurrence plot (RP). RPs were originally introduced to characterize nonlinear dynamical systems, since they contain multidimensional information despite providing only a two-dimensional visualization,” explains Prof. Ikeguchi. “Since neurons are nonlinear dynamical systems, we can hypothetically obtain information about a common input if we balance out the effects of the neural dynamics.”
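To give a rough sense of the idea (an illustrative sketch, not the authors' implementation), a recurrence plot marks the pairs of times at which a system revisits nearly the same state. The minimal Python example below assumes a scalar state trace and a hypothetical distance threshold eps:

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix for a scalar state trace x:
    R[i, j] = 1 when the states at times i and j lie within
    eps of each other, and 0 otherwise."""
    d = np.abs(x[:, None] - x[None, :])  # all pairwise distances
    return (d < eps).astype(np.uint8)

# A periodic signal revisits its states regularly, which shows up
# as diagonal bands in the two-dimensional plot.
t = np.linspace(0, 20, 400)
rp = recurrence_plot(np.sin(t), eps=0.1)
```

In practice, RPs are usually built on delay-embedded state vectors rather than raw scalar values; that step is omitted here for brevity.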
The SRP used by the team is constructed from multiple RPs, one per neuron: the values of corresponding pixels are summed across the individual plots, and each pixel of the result is then assigned a binary value, 1 if the sum is greater than or equal to 1, and 0 otherwise.
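Under that description, the superposition amounts to a pixel-wise logical OR of the individual plots. A minimal sketch, assuming the single-neuron RPs are already available as a stack of binary matrices:

```python
import numpy as np

def superposed_recurrence_plot(rps):
    """Superpose single-neuron RPs: sum the corresponding pixels
    across the stack, then binarize -- a pixel becomes 1 if the
    sum is greater than or equal to 1, and 0 otherwise (i.e., a
    pixel-wise logical OR of the individual RPs)."""
    rps = np.asarray(rps)            # shape: (n_neurons, T, T)
    return (rps.sum(axis=0) >= 1).astype(np.uint8)
```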
The team used the standard Izhikevich model to study the firing of uncoupled neurons. They considered three distinct cases of neuron firing patterns. In the first case, they reconstructed the common input for localized neurons with similar firing rates. In the second case, they did so for a mixture of neurons with different baseline firing rates. Finally, in the third case, they investigated whether the SRP method could reconstruct a common input for a chaotic response of the Izhikevich model.
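For context, the Izhikevich model describes a spiking neuron with just two variables, a membrane potential v and a recovery variable u. The sketch below integrates it with the textbook regular-spiking parameters; these values are an assumption, and the paper's exact regimes (including the chaotic one) may use different settings:

```python
import numpy as np

def izhikevich(I, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Euler integration of the Izhikevich model:
        v' = 0.04*v**2 + 5*v + 140 - u + I
        u' = a*(b*v - u)
    with the reset v <- c, u <- u + d whenever v reaches 30 mV.
    I is the input current, one value per time step, so a common
    input is modeled by feeding the same I to every (uncoupled) neuron."""
    v, u = c, b * c
    trace, spike_times = [], []
    for step, i_t in enumerate(I):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_t)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike: clip, record, reset
            trace.append(30.0)
            spike_times.append(step * dt)
            v, u = c, u + d
        else:
            trace.append(v)
    return np.array(trace), np.array(spike_times)
```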
Sure enough, they found that they could reconstruct the input signal using the SRP method, even for chaotic neurons. “When we select an adequate time period to calculate the firing rates of neurons, we are able to reconstruct the input signal with fairly high accuracy,” says Prof. Ikeguchi. This represents a major breakthrough not only for the study of the brain and neuroscience but also for the study of other dynamical systems that show chaotic behavior.
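The “adequate time period” Prof. Ikeguchi mentions is essentially the window width used to turn spike trains into firing rates. A hypothetical helper illustrating that step (the window value is a free choice here, not the paper's setting):

```python
import numpy as np

def windowed_firing_rates(spike_times, t_end, window):
    """Firing rate (spikes per unit time) in consecutive windows
    of the given width over the interval [0, t_end]."""
    edges = np.arange(0.0, t_end + window, window)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts / window
```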
The potential implications of their findings are huge for artificial intelligence, as Prof. Ikeguchi notes: “Current artificial intelligence models cannot truly reproduce the information processing power of our brains. This is because the neuron models used are too simplified and far from representative of the actual neurons in our brains. Our research brings us one step closer to understanding how information processing happens within our brains. This could pave the way for novel neuromorphic computing devices.” Additionally, it could help us better understand the onset of mental health disorders and devise treatments for them.
Overall, the study could be an eye-opener regarding how well (or little) we understand our brain.
***
Reference
DOI: https://doi.org/10.1103/PhysRevE.106.034205
About Tokyo University of Science
Tokyo University of Science (TUS) is a well-known and respected university, and the largest science-specialized private research university in Japan, with four campuses in central Tokyo and its suburbs and in Hokkaido. Established in 1881, the university has continually contributed to Japan's development in science by instilling a love of science in researchers, technicians, and educators.
With a mission of “Creating science and technology for the harmonious development of nature, human beings, and society,” TUS has undertaken a wide range of research from basic to applied science. TUS has embraced a multidisciplinary approach to research and undertaken intensive study in some of today's most vital fields. TUS is a meritocracy where the best in science is recognized and nurtured. It is the only private university in Japan that has produced a Nobel Prize winner and the only private university in Asia to produce Nobel Prize winners within the natural sciences field.
Website: https://www.tus.ac.jp/en/mediarelations/
About Professor Tohru Ikeguchi from Tokyo University of Science
Tohru Ikeguchi received his B.E., M.E., and D.E. degrees from Tokyo University of Science (TUS), Japan. After working for nearly a decade as a Full Professor at Saitama University, Japan, he joined TUS as a Full Professor in the Department of Management Science, where he worked from 2014 to 2016. Since then, he has been a Full Professor in the Department of Information and Computer Technology at TUS. His research interests include nonlinear time series analysis, computational neuroscience, the application of chaotic dynamics to solving combinatorial optimization problems, and complex network theory. He has published over 230 papers and proceedings.
Funding information
The study was supported by Grant-in-Aid for Scientific Research (C) (nos. JP17K00348, JP18KT0076, and JP21K12093), Grant-in-Aid for Scientific Research (B) (no. JP21H03514), Grant-in-Aid for Scientific Research (A) (no. JP20H00596), Grant-in-Aid for Challenging Research (Pioneering) (no. JP22K18419), and Moonshot R&D Grant Number JPMJMS2021.
Journal
Physical Review E
Method of Research
Computational simulation/modeling
Subject of Research
Not applicable
Article Title
Superposed recurrence plots for reconstructing a common input applied to neurons
Article Publication Date
12-Sep-2022
COI Statement
Nothing to declare.