Interplay of random and nonnormally structured connectivity in the dynamics of neural networks
Yashar Ahmadian (Columbia University)
Neuronal networks exhibit significant randomness in their synaptic connectivity. Importantly, alongside this randomness, the connectivity of most neural networks also features ordered structure at various levels, depending on the network's function. Investigating the interplay of these two features of connectivity, and their respective roles in the dynamics of neural networks and in the computations these networks perform, constitutes a general theoretical problem in neuroscience. Of particular interest are connectivity structures described by a nonnormal matrix. In this case the network has a hidden feedforward connectivity structure between orthogonal activity patterns, each of which can also excite or inhibit itself. Such networks arise naturally from the separation of excitatory and inhibitory neurons and yield large transient amplification of patterns without any dynamical slowing. The latter effect has been used to explain the similarity of the fluctuating patterns of spontaneous activity in primary visual cortex (V1) to patterns of activity evoked by visual stimuli.
In my second talk, I will present the results of a recent project in which, as a step toward the general problems mentioned above, I studied properties of large connectivity matrices of the form W = M + J, where M (the average connectivity) is an arbitrary deterministic matrix that represents structure in the connectivity and is generally nonnormal, and J is a zero-mean random matrix with possibly correlated and non-uniformly scaled elements. Specifically, using the Feynman diagram technique, we have derived a general formula for the eigenvalue distribution of matrices of this type, generalizing the circular law for fully random matrices. Furthermore, with the aim of studying the effect of random connectivity on the hidden feedforward structure and the transient amplification that make nonnormal connectivity matrices interesting, we have derived general formulae for the transient evolution of the magnitude and the frequency power spectrum of the linear response of firing-rate networks to external inputs. I will present some example applications relevant to neuroscience, and in particular briefly discuss how our general formula for the eigenvalue distribution was used by our colleagues (M. Stern and L. Abbott) in the study of a clustered neural network with random inter-cluster connectivity, to map out the boundary between a chaotic and a glassy phase of the network.
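The fully random special case that the talk's formula generalizes can be checked numerically. The sketch below (my own illustration, with arbitrary parameter choices) samples an N x N i.i.d. Gaussian matrix J with element variance g^2/N, the M = 0 case of W = M + J, and verifies that its eigenvalues approximately fill a disk of radius g in the complex plane, as the circular law predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g = 500, 1.0                                  # illustrative size and gain
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N)) # element variance g^2 / N
eigs = np.linalg.eigvals(J)

radius = np.abs(eigs).max()                      # spectral radius, approx. g for large N
inside = np.mean(np.abs(eigs) <= 1.05 * g)       # fraction of eigenvalues in the disk
```

For a structured, nonnormal M, the spectrum of W = M + J deviates from this disk, and the general formula presented in the talk characterizes the resulting eigenvalue distribution.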