
A memristive computational neural network model for time-series processing / Pistolesi, Veronica; Ceni, Andrea; Milano, Gianluca; Ricciardi, Carlo; Gallicchio, Claudio. - In: APL MACHINE LEARNING. - ISSN 2770-9019. - 3:1(2025). [DOI: 10.1063/5.0255168]

A memristive computational neural network model for time-series processing

Pistolesi, Veronica; Ceni, Andrea; Milano, Gianluca; Ricciardi, Carlo; Gallicchio, Claudio
2025

Abstract

In this work, we introduce a novel computational framework inspired by the physics of memristive devices and systems, which we embed into the context of Recurrent Neural Networks (RNNs) for time-series processing. Our proposed memristive-friendly neural network architecture leverages both the principles of Reservoir Computing (RC) and fully trainable RNNs, providing a versatile platform for sequence learning. We provide a mathematical analysis of the stability of the resulting neural network dynamics, identifying the role of crucial RC-based architectural hyper-parameters. Through numerical simulations, we demonstrate the effectiveness of the proposed approach across diverse regression and classification tasks, showcasing performance that is competitive with both traditional RC and fully trainable RNN systems. Our results highlight the scalability and adaptability of memristive-inspired computational architectures, offering a promising path toward efficient neuromorphic computing for complex sequence-based applications. (c) 2025 Author(s). All article content, except where otherwise noted, is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International (CC BY-NC-ND) license (https://creativecommons.org/licenses/by-nc-nd/4.0/).
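Note: the abstract refers to RC-based architectural hyper-parameters that govern the stability of the recurrent dynamics. As an illustrative reference only, and not the memristive model proposed in the article, the minimal sketch below implements a standard leaky Echo State Network update, where the spectral radius of the recurrent matrix and the leak rate are the kind of hyper-parameters typically analyzed for stable reservoir dynamics; all sizes, names, and values here are assumptions made for the example.

```python
import numpy as np

# Minimal leaky Echo State Network update (a standard RC baseline, not the
# article's memristive model). Hyper-parameter values are illustrative only.

rng = np.random.default_rng(0)

n_in, n_res = 1, 100        # input and reservoir sizes (assumed)
spectral_radius = 0.9       # < 1 is the usual heuristic for stable dynamics
leak_rate = 0.5             # leaky-integration coefficient
input_scaling = 1.0

# Random input and recurrent weights; the recurrent matrix is rescaled to the
# desired spectral radius, the key RC stability hyper-parameter.
W_in = input_scaling * rng.uniform(-1, 1, (n_res, n_in))
W = rng.uniform(-1, 1, (n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def step(x, u):
    """One reservoir update: x(t+1) = (1 - a) x(t) + a tanh(W_in u + W x)."""
    return (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)

# Drive the reservoir with a toy input sequence and collect states, on which
# a linear readout would then be trained (the only trained part in classic RC).
x = np.zeros(n_res)
states = []
for t in range(200):
    u = np.array([np.sin(0.1 * t)])
    x = step(x, u)
    states.append(x)
states = np.asarray(states)  # shape (200, n_res)
```

Keeping the spectral radius below 1 is the usual heuristic associated with the echo state property, i.e., fading memory of past inputs; fully trainable RNNs instead learn the recurrent weights directly.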
Files in this product:

File: 016117_1_5.0255168.pdf
Access: open access
Type: final published article (publisher's version)
License: Creative Commons
Size: 5.98 MB
Format: Adobe PDF (View/Open)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11696/86705
Citations
  • PubMed Central: n/a
  • Scopus: n/a
  • Web of Science: 0