Sequence learning with the BCPNN learning rule
Time: Thu 2019-05-09 11.15 - 12.15
Location: Room 4423, Lindstedtsvägen 5, KTH, Stockholm
Participating: Ramón Heberto Martínez Mayorquin, CST/EECS/KTH
The Bayesian Confidence Propagation Neural Network (BCPNN) learning rule has been used successfully to model learning in attractor memory neural networks, often conceptualized to describe computational aspects of supragranular cortical activity [1]. While BCPNN has been shown to support the encoding of sequential activity in attractor networks [2], a full systematic account of its capabilities has not yet been developed. In this talk, I will present advances in this direction by developing a formal account of the learning abilities of BCPNN in the temporal domain, with an emphasis on sequence learning, which constitutes the main contribution of my doctoral research [3]. In particular, I will present: 1) a mathematical relationship between the temporal structure of the input and the learned weights, which clarifies exactly what is encoded in the connectivity matrix; 2) an analytical characterization of the relationship between the structure of the connectivity matrix, the dynamical network parameters, and the temporal aspects of the recall process; and 3) the effects of modularity on the recall dynamics, robustness to noise, and the network's capacity to store multiple sequences. Finally, I will discuss my ongoing work on the storage capacity of the network and possible applications to chunking in sequences.
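For orientation, the core of the BCPNN learning rule can be written as a log-odds weight, w_ij = log(p_ij / (p_i p_j)), with bias b_j = log p_j, where the probabilities are activation and co-activation estimates of the units. The sketch below, a simplifying assumption on my part, estimates these probabilities from batch frequencies of binary activity; the spike-based formulation in Tully et al. (2016) instead uses exponentially decaying traces, which is what gives the rule its temporal sensitivity. The function name `bcpnn_weights` and the regularizer `eps` are illustrative, not from the talk.

```python
import numpy as np

def bcpnn_weights(activity, eps=1e-6):
    """Batch estimate of BCPNN weights from binary unit activity.

    activity: (T, N) array of unit activations in [0, 1].
    Returns the (N, N) weight matrix w_ij = log(p_ij / (p_i * p_j))
    and the (N,) bias b_j = log(p_j). The small eps avoids log(0)
    for units or pairs that never activate.
    """
    T, N = activity.shape
    p_i = activity.mean(axis=0) + eps          # unit activation probabilities
    p_ij = (activity.T @ activity) / T + eps   # pairwise co-activation probabilities
    w = np.log(p_ij / np.outer(p_i, p_i))
    b = np.log(p_i)
    return w, b

# Toy example: units 0 and 1 always fire together, unit 2 fires in the
# complementary pattern, so w[0, 1] comes out positive (above-chance
# co-activation) and w[0, 2] strongly negative (never co-active).
rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=(1000, 1))
activity = np.hstack([a, a, 1 - a]).astype(float)
w, b = bcpnn_weights(activity)
```

In this batch form, the weight is simply the pointwise mutual information between pre- and postsynaptic activity, which is the sense in which the connectivity matrix stores input statistics.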
[1] Lansner, Anders. "Associative memory models: from the cell-assembly theory to biophysically detailed cortex simulations." Trends in Neurosciences 32.3 (2009): 178-186.
[2] Tully, Philip J., et al. "Spike-based Bayesian-Hebbian learning of temporal sequences." PLoS Computational Biology 12.5 (2016): e1004954.
[3] Martinez, Ramon Heberto, Pawel Herman, and Anders Lansner. "Probabilistic associative learning suffices for learning the temporal structure of multiple sequences." bioRxiv (2019): 545871.