InterJournal Complex Systems, 1785
Status: Accepted
Manuscript Number: [1785]
Submission Date: 2006
Topological and dynamical structures induced by Hebbian learning in random neural networks
Author(s): Hugues Berry, Benoît Siri, Bruno Cessac, Bruno Delord

Subject(s): CX.18

Category:

Abstract:

"Topological and dynamical structures induced by Hebbian learning in random neural networks" B. Siri, H. Berry, B. Cessac, B. Delord, M. Quoy In recent years, a vast amount of work concerning dynamical systems interacting on complex networks has focused on the influence of network topology on the global dynamics. In this framework, neural networks are particularly interesting because the dynamics of the neurons (the network nodes) depends on synaptic weights (the network links) that themselves vary over time ("learning") as a function of the neuron dynamics. This mutual coupling between node dynamics and network topology remains largely obscure. Here, we study the consequences of such a coupling on dynamics and architecture. To this end, we investigate the influence of learning on the topology of random recurrent neural networks, which exhibit learning and dynamical behaviors yielding associative memory properties that mimic those observed in the olfactory bulb. The state of the neurons evolves over time through classical firing-rate dynamics. We investigate several learning rules to update synaptic strength. These rules are simple implementations of Hebb's rule for learning in biological neurons (i.e. neurons which fire together become more tightly coupled). Due to the aforementioned coupling, learning shapes the network dynamics, topology and function. We evidence that the modifications of the dynamics can be related to changes in the local loop content. We further show that, because of these local structural alterations, the global network topology changes as well. Indeed, under the influence of learning, the distribution of the strong synapses on the network is no more homogeneous, i.e. two neurons have an increasing probability to be strongly coupled if they are both connected to a third neuron by strong synapses. Hence the resulting network is highly clustered. Besides, its mean-shortest path remains low, so that these learning rules organize the network as a small-world one. We obtain some criteria to discriminate between rules giving rise to such organizations from those that do not. Hence, these findings raise the hypothesis that small-worldness in natural neural networks may be a spontaneous consequence of the learning scheme governing the network links. Moreover, we show that pattern recognition task emerges from this mutual coupling, thus questioning the relevance of small-world architectures for storing and processing information.



Public Comments: