MOJ Applied Bionics and Biomechanics
eISSN: 2576-4519

Editorial Volume 8 Issue 1

Life-long learning and evolving associative memories in brain-inspired spiking neural networks

Nikola K Kasabov

Emeritus Professor, Department of Computer Science, Auckland University of Technology, New Zealand

Correspondence: Nikola K. Kasabov, Emeritus Professor, Department of Computer Science, Auckland University of Technology, Auckland, New Zealand; Institute for Information and Communication Technologies, Bulgarian Academy of Sciences, Sofia, Bulgaria; Computer Engineering Department, Dalian University, China, Tel +6421488214

Received: April 24, 2024 | Published: May 8, 2024

Citation: Kasabov NK. Life-long learning and evolving associative memories in brain-inspired spiking neural networks. MOJ App Bio Biomech. 2024;8(1):56-57. DOI: 10.15406/mojabb.2024.08.00208


Editorial

This paper argues that evolving associative memories (EAM), which are manifested in all biological systems and realised in the human brain through life-long learning (LLL), can also be realised in brain-inspired computational architectures based on spiking neural networks (SNN). It points to the importance of the duality of the EAM and LLL concepts for future AI systems.

Evolving associative memory (EAM) is perhaps one of the most fundamental principles of evolution in nature and of the development of living organisms, its ultimate result being the human brain. Yet this principle remains little explored, in either the brain or AI systems. EAMs are natural or artificial systems that incrementally and continuously associate and capture related items, objects and processes, and that can be recalled using partial information. EAMs are created continuously and are manifested in all existing dynamical systems, from atoms and molecules to the Universe, and from neural networks to the brain (Figure 1).

Figure 1 EAM is a basic information principle of all dynamical systems in nature, from atoms to the brain.
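To make the principle concrete, the following is a minimal, illustrative sketch of an associative memory that is built incrementally and recalled from a partial cue, using a classical Hebbian (Hopfield-style) store in Python. It is a toy illustration of the EAM principle only, not the SNN models discussed below, and all names in it (SimpleEvolvingAM, store, recall) are hypothetical.

```python
import numpy as np

class SimpleEvolvingAM:
    """Toy Hebbian associative memory; an illustrative sketch, not NeuCube."""
    def __init__(self, n):
        self.W = np.zeros((n, n))          # all-to-all weight matrix

    def store(self, pattern):
        """Incrementally associate one new bipolar (+1/-1) pattern."""
        p = np.asarray(pattern, dtype=float)
        self.W += np.outer(p, p)           # Hebbian outer-product update
        np.fill_diagonal(self.W, 0.0)

    def recall(self, cue, steps=10):
        """Recover a full stored pattern from a partial cue (0 = unknown)."""
        s = np.asarray(cue, dtype=float)
        for _ in range(steps):
            s = np.sign(self.W @ s)
            s[s == 0.0] = 1.0
        return s

am = SimpleEvolvingAM(8)
am.store([1, -1, 1, 1, -1, -1, 1, -1])       # patterns are added one by one,
am.store([-1, -1, 1, -1, 1, 1, -1, -1])      # with no retraining from scratch
print(am.recall([1, -1, 1, 1, 0, 0, 0, 0]))  # partial cue -> first pattern
```

Each call to store() adds one more association without retraining from scratch, and recall() recovers a complete pattern from an incomplete cue; these are the two defining properties of an EAM described above.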

EAMs in biological systems, and more specifically in the brain, are created through life-long learning (LLL), in which structures (e.g. clusters) of items related in time and space are created and modified continuously. LLL, in turn, relies on adding new items to already existing structures based on commonality and similarity, so that LLL and EAM are dual principles of the same process. This duality involves molecular and neural functions at different levels in the brain, such as neurogenesis, neuromodulation, episodic replay, metaplasticity and multisensory integration.1 LLL in the brain is the ultimate inspiration for LLL in artificial systems based on neural networks, and more specifically on brain-inspired spiking neural network (SNN) architectures, where spatio-temporal connectionist structures are formed and modified continuously to form evolving spatio-temporal associative memories (ESTAM).2–5
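The duality described above, where a new item is either merged into an existing structure or evolves a new one, can be sketched as a simple one-pass clustering loop. The distance threshold and running-mean update below are illustrative assumptions, not the published ESTL algorithm:

```python
import numpy as np

def evolve_clusters(stream, radius=1.0):
    """One-pass, LLL-style clustering: items arrive one at a time."""
    centres, counts = [], []
    for x in stream:
        x = np.asarray(x, dtype=float)
        if centres:
            d = [np.linalg.norm(x - c) for c in centres]
            k = int(np.argmin(d))
            if d[k] <= radius:                 # similar enough: update
                counts[k] += 1                 # an existing memory
                centres[k] += (x - centres[k]) / counts[k]
                continue
        centres.append(x.copy())               # novel: evolve a new memory
        counts.append(1)
    return centres

stream = [[0.1, 0.2], [0.2, 0.1], [5.0, 5.1], [4.9, 5.2], [0.15, 0.15]]
print(evolve_clusters(stream, radius=1.0))     # two evolved clusters
```

Note the absence of a fixed number of clusters: new structures are created only when incoming items are dissimilar to everything already stored, which is the sense in which the memory evolves.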

As an AI machine-learning model, an ESTAM is trained on a full set of spatio-temporal variables, but it can be successfully recalled on only a subset of these variables, measured in different time intervals. In addition, an ESTAM model can be further incrementally evolved on a new set of variables measured at different time windows. In3 ESTAMs are built with the use of evolving spatio-temporal learning (ESTL) methods using SNN, where existing spatial, temporal and other multimodal data are integrated to train the model. The model captures evolvable and explainable spatio/spectro-temporal patterns and can be further evolved on new data.
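As a hedged illustration of recall on a subset of variables, the sketch below stores one spatio-temporal pattern (variables × time) with the same Hebbian rule as in the earlier sketch, then recalls it when half of the variables were never measured. The flattened-vector encoding is an assumption made for the example, not the ESTL method:

```python
import numpy as np

V, T = 4, 6                                    # 4 variables, 6 time steps
rng = np.random.default_rng(0)
pattern = rng.choice([-1.0, 1.0], size=(V, T)) # one spatio-temporal pattern

p = pattern.flatten()
W = np.outer(p, p)                             # Hebbian one-shot store
np.fill_diagonal(W, 0.0)

cue = pattern.copy()
cue[2:, :] = 0.0                               # two variables never measured
s = cue.flatten()
for _ in range(5):                             # iterative recall
    s = np.sign(W @ s)
    s[s == 0.0] = 1.0

print(np.array_equal(s.reshape(V, T), pattern))  # True: full pattern recalled
```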

The idea of an ESTAM using the brain-inspired SNN architecture NeuCube was suggested in4 and further developed in.3 A NeuCube model processes information represented as spikes, forming binary time sequences. It has a 3D structure that is initialised using a brain template for brain-data applications, but in general it can be initialised in other ways, still accounting for the similarity of the input temporal variables.6 LLL in NeuCube is achieved in a connectionist way, where new connections are created and updated continuously, and they can be recalled/activated using only partial input information, based on the “synfire”7 and polychronisation8 principles. To achieve LLL in a brain-inspired SNN architecture, several methods can be used in concert, such as: integrated spike-time and error-backpropagation learning;5,9 neuromodulatory synaptic connections;10 synaptic weight regulation;5 homeostasis;11 a Lyapunov energy function;12 and evolving classifiers, where output neurons are evolved and aggregated continuously from data.13 A NeuCube-based ESTAM learns through transfer learning, always evolving and reporting fuzzy spatio-temporal rules.14,15
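The binary spike sequences mentioned above can be produced from continuous measurements by threshold-based (delta) encoding, one common approach for preparing SNN input. The sketch below is a generic, hedged illustration with an arbitrary threshold, rather than NeuCube's exact encoder:

```python
import numpy as np

def encode_spikes(signal, threshold=0.5):
    """Emit +1/-1 spikes when the signal drifts more than `threshold`
    away from the last spiking level (delta modulation)."""
    signal = np.asarray(signal, dtype=float)
    spikes = np.zeros(len(signal), dtype=int)
    baseline = signal[0]
    for t in range(1, len(signal)):
        delta = signal[t] - baseline
        if delta > threshold:
            spikes[t] = 1                      # positive (ON) spike
            baseline = signal[t]
        elif delta < -threshold:
            spikes[t] = -1                     # negative (OFF) spike
            baseline = signal[t]
    return spikes

t = np.linspace(0, 2 * np.pi, 50)
print(encode_spikes(np.sin(t), threshold=0.3)) # sparse binary spike sequence
```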

A major advantage of building EAMs with SNNs through LLL is that such models can be trained continuously on new multimodal data and recalled on smaller data sets with missing modalities, allowing for efficient and early prediction of future events. This has already been demonstrated on several AI problems, such as brain neuroimaging data classification,16 moving-object recognition using audio-visual data,5 and financial and economic data prediction,17 among others. LLL and EAM at a personal level are key concepts for the process of aging well, and perhaps for reverse aging,18,19 and they are certainly major concepts to be achieved in future AI systems.

Acknowledgments

The work is supported by Auckland University of Technology and Knowledge Engineering Consulting Ltd (https://knowledgeengineering.ai). A NeuCube software development system that allows for building ESTAM is available in both Matlab and Python versions from: https://www.aut.ac.nz/neucube.

Funding

None.

Conflicts of interest

The author declares that there is no conflict of interest.

References

  1. Kudithipudi D, Aguilar-Simon M, Babb J, et al. Biological underpinnings for lifelong learning machines. Nat Mach Intell. 2022;4:196–210.
  2. Parisi GI, Kemker R, Part JL, et al. Continual lifelong learning with neural networks: a review. Neural Netw. 2019;113:54–71.
  3. Kasabov N. STAM-SNN: Spatio-temporal associative memories in brain-inspired spiking neural networks: concepts and perspectives. TechRxiv. Preprint. 2023.
  4. Kasabov N. NeuCube: a spiking neural network architecture for mapping, learning and understanding of spatio-temporal brain data. Neural Netw. 2014;52:62–76.
  5. Kasabov N. Time-space, spiking neural networks and brain-inspired artificial intelligence. Springer-Nature. 2019.
  6. Tu E, Kasabov N, Yang J. Mapping temporal variables into the NeuCube spiking neural network architecture for improved pattern recognition and predictive modelling. IEEE Trans Neural Netw Learn Syst. 2017;28(6):1305–1317.
  7. Abeles M. Corticonics: neural circuits of the cerebral cortex. Cambridge University Press, NY. 1991.
  8. Izhikevich E. Polychronization: computation with spikes. Neural Comput. 2006;18(2):245–282.
  9. Saeedinia SA, Jahed-Motlagh MR, Tafakhori A, et al. Design of MRI structured spiking neural networks and learning algorithms for personalized modelling, analysis, and prediction of EEG signals. Sci Rep. 2021;11(1):12064.
  10. Espinosa-Ramos J, Capecci E, Kasabov N. A computational model of neuroreceptor-dependent plasticity (NRDP) based on spiking neural networks. IEEE Trans Cogn Dev Sys. 2019;11(1):63–72.
  11. Widrow B. Cybernetics 2.0: A general theory of adaptivity and homeostasis in the brain and in the body. Springer. 2023.
  12. Bahrami H. PhD Thesis, Auckland University of Technology, 2023.
  13. Kasabov N, Dhoble K, Nuntalid N, et al. Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition. Neural Netw. 2013;41:188–201.
  14. Kasabov N, Tan Y, Doborjeh M, et al. Transfer learning of fuzzy spatio-temporal rules in the NeuCube brain-inspired spiking neural network: a case study on EEG spatio-temporal data. IEEE Trans Fuzzy Sys. 2023; 31(12):4542–4552.
  15. Kumarasinghe K, Kasabov N, Taylor D. Deep learning and deep knowledge representation in spiking neural networks for brain-computer interfaces. Neural Netw. 2020;121:169–185.
  16. Kasabov N, Bahrami H, Doborjeh M, et al. Brain inspired spatio-temporal associative memories for neuroimaging data: EEG and fMRI. Bioengineering. 2023;10(12):1341.
  17. AbouHassan I, Kasabov N, Bankar T, et al. PAMeT-SNN: predictive associative memory for multiple time series based on spiking neural networks with case studies in economics and finance. TechRxiv. Preprint. 2023.
  18. Szu H. Editorial about reverse aging. MOJ App Bio Biomech. 2022;6(1):1.
  19. Szu HH, Lum P, Tang MJ, et al. Minimizing digital & analog bio-information loss for aging toward reversing. MOJ App Bio Biomech. 2022;6(1):35–43.
Creative Commons Attribution License

©2024 Kasabov. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon your work non-commercially.