TY - JOUR
T1 - Bayesian continual learning via spiking neural networks
AU - Skatchkovsky, Nicolas
AU - Jang, Hyeryung
AU - Simeone, Osvaldo
N1 - Publisher Copyright:
Copyright © 2022 Skatchkovsky, Jang and Simeone.
PY - 2022/11/16
Y1 - 2022/11/16
AB - Among the main features of biological intelligence are energy efficiency, capacity for continual adaptation, and risk management via uncertainty quantification. Neuromorphic engineering has been thus far mostly driven by the goal of implementing energy-efficient machines that take inspiration from the time-based computing paradigm of biological brains. In this paper, we take steps toward the design of neuromorphic systems that are capable of adaptation to changing learning tasks, while producing well-calibrated uncertainty quantification estimates. To this end, we derive online learning rules for spiking neural networks (SNNs) within a Bayesian continual learning framework. In it, each synaptic weight is represented by parameters that quantify the current epistemic uncertainty resulting from prior knowledge and observed data. The proposed online rules update the distribution parameters in a streaming fashion as data are observed. We instantiate the proposed approach for both real-valued and binary synaptic weights. Experimental results using Intel's Lava platform show the merits of Bayesian over frequentist learning in terms of capacity for adaptation and uncertainty quantification.
KW - Bayesian learning
KW - artificial intelligence
KW - neuromorphic hardware
KW - neuromorphic learning
KW - spiking neural networks
UR - http://www.scopus.com/inward/record.url?scp=85143201255&partnerID=8YFLogxK
U2 - 10.3389/fncom.2022.1037976
DO - 10.3389/fncom.2022.1037976
M3 - Article
AN - SCOPUS:85143201255
SN - 1662-5188
VL - 16
JO - Frontiers in Computational Neuroscience
JF - Frontiers in Computational Neuroscience
M1 - 1037976
ER -