Neuroscience: modeling and neuronal network dynamics.

**Dynamical analysis of neural networks.** The aim is threefold:

(i) To propose "good" models, namely biologically relevant and mathematically well-posed;

(ii) To analyse the dynamics of these models, with rigorous analytical results where possible and with good control of numerical simulations;

(iii) To interpret and extrapolate these results and compare them to the behaviour of real neuronal networks.

This constitutes a real challenge, since neuronal networks are dynamical systems with a huge number of degrees of freedom and parameters, and a multi-scale organisation with complex interactions. For example, the dynamics of neurons depend on the synaptic connections, while the synapses evolve according to the activity of the neurons. Analysing this interwoven activity
requires the development of new methods. Methods coming
from theoretical physics and applied mathematics can be
adapted to produce efficient analysis tools while
providing useful concepts. I am developing such methods
based on statistical physics (mean-field theory, Gibbs
distribution analysis), dynamical systems theory (global
stability, bifurcation analysis, characterisation of
chaotic dynamics) and ergodic theory (symbolic coding,
thermodynamic formalism). I believe that such an analysis is an important step towards the characterisation of in vitro or in vivo neuronal networks, from spatial scales corresponding to a few neurons to scales characterising, e.g., cortical columns. My colleagues and I have characterised the dynamics of several firing-rate and spiking-neuron models (see the publication list below).
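
As an illustration of the kind of dynamics at stake, the sketch below (a generic toy model with arbitrary parameters, not any specific model from the publications below) simulates a discrete-time random recurrent firing-rate network and estimates its largest Lyapunov exponent, the standard signature of chaotic dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                         # number of neurons
g = 4.0                                         # coupling gain; large g -> chaotic regime
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # random synaptic weights, variance 1/N

def step(x):
    """One discrete-time update of the firing-rate network."""
    return np.tanh(g * (J @ x))

def largest_lyapunov(x0, steps=2000, eps=1e-8):
    """Estimate the largest Lyapunov exponent by tracking the growth of a
    small perturbation, renormalised to size eps at every step."""
    x = x0.copy()
    d = rng.normal(size=N)
    d *= eps / np.linalg.norm(d)
    acc = 0.0
    for _ in range(steps):
        x_pert = step(x + d)
        x = step(x)
        d = x_pert - x
        nd = np.linalg.norm(d)
        acc += np.log(nd / eps)
        d *= eps / nd
    return acc / steps

x0 = rng.uniform(-1.0, 1.0, N)
lam = largest_lyapunov(x0)
print(f"largest Lyapunov exponent ~ {lam:.3f}")
```

With a strong enough coupling gain the exponent is positive, i.e. nearby trajectories diverge exponentially; this is the regime where the bifurcation and chaos analyses mentioned above become relevant.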

**Spike train analysis.** Neuronal activity results from
complex and nonlinear mechanisms leading to a wide
variety of dynamical behaviours. This activity is
revealed by the emission of action potentials or
``spikes''. While the shape of an action potential is
essentially always the same for a given neuron, the
succession of spikes emitted by this neuron can have a
wide variety of patterns (isolated spikes, periodic
spiking, bursting, tonic spiking, tonic bursting, ...),
depending on physiological parameters, but also on
excitations coming either from other neurons or from
external inputs. Thus, it seems natural to consider
spikes as ``information quanta'' or ``bits'' and
to seek the information exchanged by neurons in the
structure of spike trains. In doing so, one switches from a description of neurons in terms of membrane-potential dynamics to a description in terms of spike trains. This point of view is used in experiments through the analysis of raster plots, where the activity of a neuron is represented by a mere vertical bar each time this neuron emits a spike. Though this change of
description raises many questions, it is commonly
admitted in the computational neuroscience community
that spike trains constitute a ``neural code''. This
raises, however, further questions. How is ``information'' encoded in a spike train? How can the information content of a spike train be measured? As a matter of fact, a prerequisite to handling ``information'' in a spike train is the definition of a suitable probability distribution that matches the empirical averages obtained from measurements, and there is currently a wide debate on the canonical form of these probabilities. We are developing
methods for the characterisation of spike trains from
empirical data. On the one hand, we have shown how Gibbs distributions (in a more general sense than in statistical mechanics; see e.g. refs 1, 3, 4 below) are natural candidates for spike-train statistics. On the other hand, we are developing numerical methods for the characterisation of Gibbs distributions from experimental data (see the webpage http://enas.gforge.inria.fr/v3/).
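
As a toy illustration of this moment-matching problem (a hypothetical miniature example, not the software quoted above), the sketch below fits a Gibbs distribution with an Ising-like potential (firing rates plus pairwise terms) to synthetic binarized spike trains, by exact enumeration of the 2^n spike patterns:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Synthetic binarized spike trains: T time bins, n neurons, one 0/1 word per bin.
n, T = 3, 5000
spikes = (rng.random((T, n)) < np.array([0.2, 0.3, 0.25])).astype(float)
mix = rng.random(T) < 0.5
spikes[mix, 1] = spikes[mix, 0]          # correlate neurons 0 and 1

# Ising-like feature map: single-neuron events and pairwise coincidences.
pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
def features(w):
    return np.concatenate([w, [w[i] * w[j] for i, j in pairs]])

emp = np.mean([features(w) for w in spikes], axis=0)     # empirical averages
patterns = np.array(list(product([0.0, 1.0], repeat=n)))
F = np.array([features(w) for w in patterns])            # features of all 2^n words

# Gradient ascent on the (concave) log-likelihood: the gradient is the gap
# between empirical and model averages, so the fit is exactly moment matching.
theta = np.zeros(F.shape[1])
for _ in range(5000):
    p = np.exp(F @ theta)
    p /= p.sum()                                         # Gibbs distribution on words
    theta += 0.5 * (emp - p @ F)

p = np.exp(F @ theta)
p /= p.sum()
gap = np.abs(p @ F - emp).max()
print("worst moment mismatch:", gap)
```

Here successive time bins are treated as independent; the Gibbs distributions studied in the papers below also handle spatio-temporal constraints, which is where the notion of a Gibbs distribution in a sense more general than equilibrium statistical mechanics becomes necessary.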

**Mean-field analysis of neuronal networks.** This method, well
known in the field of statistical physics and quantum field
theory, is used in the study of neural network dynamics with the aim of modeling neural activity
at scales integrating the effect of thousands of
neurons. This is of central importance for several
reasons. First, most imaging techniques
are not able to measure individual neuron activity
(``microscopic'' scale), but are instead
measuring mesoscopic effects resulting from the activity of several hundreds to several hundreds of thousands of neurons. Second, anatomical data recorded in the cortex reveal the existence of structures, such as cortical columns, with a diameter of about 50 micrometers to 1 millimeter, containing on the order of one hundred to one thousand neurons belonging to a few different types. In this case, information
processing does not occur at the scale of
individual neurons but rather corresponds to an
activity integrating the collective dynamics of many interacting
neurons and resulting in a mesoscopic signal. Dynamic mean-field theory allows one to obtain the evolution equations of the effective mean field from the microscopic dynamics in several model examples. We have obtained them in several examples of discrete-time neural networks with firing rates, and derived rigorous results on the mean-field dynamics of models with several populations, allowing the rederivation of classical phenomenological equations for cortical columns such as Jansen and Rit's. We are now developing a mean-field theory for correlated synaptic weights.
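
The flavour of such a derivation can be shown on a toy example (a generic random firing-rate network with arbitrary parameters, not one of the multi-population models of the papers below): for x_i(t+1) = tanh(g * sum_j J_ij x_j(t)) with i.i.d. Gaussian weights of variance 1/N, the local field becomes Gaussian as N -> infinity, and the population second moment obeys a closed recursion:

```python
import numpy as np

rng = np.random.default_rng(2)
N, g, T = 2000, 1.5, 60

# Microscopic dynamics: x_i(t+1) = tanh(g * sum_j J_ij x_j(t)),
# with i.i.d. centred Gaussian weights of variance 1/N.
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
x = rng.uniform(-1.0, 1.0, N)
qs = []
for _ in range(T):
    x = np.tanh(g * (J @ x))
    qs.append(np.mean(x**2))
q_network = np.mean(qs[-10:])        # empirical second moment at late times

# Mean-field limit: the local field h_i(t) is Gaussian with variance g^2 q(t),
# so q(t+1) = E_z[tanh^2(g sqrt(q(t)) z)], z ~ N(0,1). Iterate to the fixed point
# (the Gaussian expectation is evaluated by Monte Carlo).
z = rng.standard_normal(200_000)
q = 0.5
for _ in range(200):
    q = np.mean(np.tanh(g * np.sqrt(q) * z) ** 2)

print("network:", q_network, " mean-field:", q)
```

For large N the microscopic network average and the mean-field fixed point agree to within finite-size fluctuations; this self-averaging is what makes the mesoscopic description meaningful.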

**Dynamical effects induced by synaptic and intrinsic plasticity.** This collaboration aims to
understand how the structure of biological neural networks conditions their functional capacities, in particular learning. On the one hand, we are analysing the dynamics of neural network models submitted to Hebbian learning and investigating how the capacity to recognise objects emerges. What are the effects on dynamics and topology? For this we are using concepts from random networks and nonlinear analysis (see next item). On the other hand, we are using the information obtained via this analysis to construct artificial neural network architectures able to learn basic objects and then to perform generalisation through emergent dynamics.
A typical example is the architecture of the V1 cortex
(vision) that we are using as a guideline. In the long
term, the goal would be to produce new computer
architectures inspired by biological networks.
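
A minimal sketch of the first question (with arbitrary toy parameters, not the models analysed in the papers below): a random recurrent network in a chaotic regime is trained with a Hebbian outer-product rule plus passive weight decay, and the temporal variability of its activity is compared before and after learning:

```python
import numpy as np

rng = np.random.default_rng(3)
N, g = 200, 3.0
alpha, lam = 0.1, 0.9                # Hebbian rate and passive weight decay

J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))

def settle(J, x, steps=100):
    """Iterate the firing-rate dynamics, returning the trajectory."""
    traj = []
    for _ in range(steps):
        x = np.tanh(g * (J @ x))
        traj.append(x)
    return np.array(traj)

traj = settle(J, rng.uniform(-1, 1, N))
var_before = traj[-50:].var(axis=0).mean()   # temporal variability (chaotic regime)

# Hebbian epochs: let the network run, then reinforce co-active pairs,
# while the decay term slowly erases the initial random wiring.
x = traj[-1]
for _ in range(50):
    traj = settle(J, x, steps=20)
    x = traj[-1]
    J = lam * J + (alpha / N) * np.outer(x, x)

traj = settle(J, rng.uniform(-1, 1, N))
var_after = traj[-50:].var(axis=0).mean()
print(var_before, var_after)
```

In this toy run learning contracts the dynamics, driving the network from chaos towards a much simpler attractor, one of the effects analysed rigorously in the Siri et al. papers listed below.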

**Interplay between synaptic graph structure and dynamics.** Neuronal networks can
be regarded as graphs where each neuron is a vertex and
each synaptic connection is an edge. Models usually have simple topologies (e.g. feed-forward or
recurrent neural networks) but recent research on
nervous and brain systems suggests that the actual
topology of a real-world neuronal system is much more complex: small-world and scale-free properties are, for example, observed in human brain networks. There is
also a complex interplay between the topological structure of the synaptic graph and the nonlinear evolution of the neurons. Thus, the existence of synapses between a neuron (A) and another one (B) is implicitly attached to a notion of ``influence'', or causal and directed action, from A to B. However, neuron B usually receives synapses from many other neurons, each of them being ``influenced'' by many other neurons, possibly acting on A, etc. Thus, the actual ``influence'' or action of A on B has to be considered dynamically and in a global sense, by considering A and B not as isolated objects but, instead, as entities embedded in a system with a complex interwoven dynamical evolution. It is thus necessary to develop tools allowing one to handle this interplay. In this spirit, we are using the linear response approach. These results could lead to new
directions in neural network analysis and more generally
in the analysis of nonlinear dynamical systems on
graphs. However, the results quoted above were obtained in a specific model example, and further investigations must be done in a more general setting. In this spirit, the present project aims to explore two directions: recurrent models with spiking neurons (see item 1 above) and complex architectures and learning (item 2 above).
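
For a linearised toy network (an illustration of the idea, not the full linear response theory of the papers below), the interplay between direct edges and network-mediated paths is already visible: the total influence of A on B is given by the susceptibility matrix, a resummation of all directed paths through the graph:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 6

# Random sparse directed weight matrix, rescaled so the path series converges.
W = rng.normal(0.0, 0.3, (N, N)) * (rng.random((N, N)) < 0.4)
W *= 0.8 / np.abs(np.linalg.eigvals(W)).max()

# For linear dynamics x(t+1) = W x(t) + stimulus, the stationary response to a
# steady stimulus on neuron A is governed by the susceptibility
#   chi = W + W^2 + W^3 + ... = (I - W)^{-1} - I,
# i.e. the direct edge A -> B plus every indirect path through the graph.
chi = np.linalg.inv(np.eye(N) - W) - np.eye(N)

A, B = 0, 3
print("direct weight   A->B:", W[B, A])
print("total influence A->B:", chi[B, A])
```

Even when the direct weight W[B, A] vanishes, the susceptibility entry can be nonzero, which is why ``influence'' must be defined dynamically and globally rather than edge by edge.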

Publications in peer-reviewed journals.

- Rodrigo Cofré, Bruno Cessac, "Exact computation of the maximum-entropy potential of spiking neural-network models", Phys. Rev. E 89, 052117.
- Hassan Nasser, Bruno Cessac, Parameters estimation for spatio-temporal maximum entropy distributions: application to neural spike trains, Entropy 2014, 16(4), 2244-2277; doi:10.3390/e16042244.
- Jeremie Naudé, Bruno Cessac, Hugues Berry, and Bruno Delord, "Effects of Cellular Homeostatic Intrinsic Plasticity on Dynamical and Computational Properties of Biological Recurrent Neural Networks" The Journal of Neuroscience, 18 September 2013, 33(38): 15032-15043; doi: 10.1523/JNEUROSCI.0870-13.2013.
- B. Cessac and R. Cofré, "Spike train statistics and Gibbs distributions", J. Physiol. Paris, Volume 107, Issue 5, Pages 360-368 (November 2013). Special issue: Neural Coding and Natural Image Statistics.

- Rodrigo Cofré and Bruno Cessac, "Dynamics and spike trains statistics in conductance-based Integrate-and-Fire neural networks with chemical and electric synapses", Chaos, Solitons & Fractals, Volume 50, May 2013, Pages 13-31.
- Hassan Nasser, Olivier Marre, and Bruno Cessac, "Spike trains analysis using Gibbs distributions and Monte-Carlo method", J. Stat. Mech. (2013) P03006.
- H. Rostro-Gonzalez, B. Cessac, T. Viéville, "Parameters estimation in spiking neural networks: a reverse-engineering approach", J. Neural Eng. 9 (2012) 026024.

- The role of the asymptotic dynamics in the design of FPGA-based hardware implementations of gIF-type neural networks, Horacio Rostro-Gonzalez, Bruno Cessac, Bernard Girau, Cesar Torres-Huitzil, J. Physiol. Paris, vol. 105, n° 1–3, pages 91—97, (2011).
- J.C. Vasquez, A. Palacios, O. Marre, M.J. Berry II, B. Cessac, Gibbs distribution analysis of temporal correlation structure on multicell spike trains from retina ganglion cells, J. Physiol. Paris, Volume 106, Issues 3–4, May–August 2012, Pages 120–127.
- Cessac B., "Statistics of spike trains in conductance-based neural networks: Rigorous results", The Journal of Mathematical Neuroscience, 1:8 (2011).
- Cessac B., "A discrete time neural network model with spiking neurons: II: Dynamics with noise", Journal of Mathematical Biology, Volume 62, Issue 6, Pages 863-900 (2011).
- B. Cessac, H. Paugam-Moisy, T. Viéville, "Overview of facts and issues about neural coding by spikes", J. Physiol. Paris, 104, (1-2), 5-18, (2010).
- B. Cessac, "Neural Networks as dynamical systems", International Journal of Bifurcation and Chaos, Volume 20, Issue 6 (2010), pp. 1585-1629. DOI: 10.1142/S0218127410026721.
- B. Cessac, H. Rostro, J.C. Vasquez, T. Viéville, "How Gibbs distributions may naturally arise from synaptic adaptation mechanisms", J. Stat. Phys., 136, (3), 565-602 (2009).
- O. Faugeras, J. Touboul, B. Cessac, “A constructive mean field analysis of multi population neural networks with random synaptic weights and stochastic inputs”, Front. Comput. Neurosci. (2009) 3:1.
- B. Cessac, Viéville T., ``On Dynamics of Integrate-and-Fire Neural Networks with Adaptive Conductances.'', Front. Comput. Neurosci. (2008) 2:2.
- Siri B., Berry H., Cessac B., Delord B., Quoy M., « A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks », Neural Comp., vol 20, num 12, (2008), pp 2937-2966.
- B. Cessac ``A discrete time neural network model with spiking neurons. Rigorous results on the spontaneous dynamics'', J. Math. Biol., Volume 56, Number 3, 311-345 (2008).
- Siri, B., Quoy, M., Cessac, B., Delord, B. and Berry, H., ``Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons''. Journal of Physiology (Paris),101(1-3):138-150 (2007).
- Cessac B., "Does the complex susceptibility of the Hénon map have a pole in the upper-half plane? A numerical investigation", Nonlinearity, 20, 2883-2895 (2007).
- Samuelides M., Cessac B., "Random recurrent neural networks dynamics.", EPJ Special Topics "Topics in Dynamical Neural Networks : From Large Scale Neural Networks to Motor Control and Vision", Vol. 142, Num. 1, 7-88, (2007).
- Cessac B., Samuelides M., "From Neuron to Neural Networks dynamics. ", EPJ Special Topics "Topics in Dynamical Neural Networks : From Large Scale Neural Networks to Motor Control and Vision", Vol. 142, Num. 1, 89-122, (2007).
- Cessac B., Dauce E., Perrinet L., Samuelides M., ``Topics in dynamical neural networks - From large scale neural networks to motor control and vision – Introduction'', EPJ Special Topics, Vol. 142, Num 1,1-5, (2007).
- Cessac B., Sepulchre J.A., "Linear Response in a class of simple systems far from equilibrium". , Physica D, Volume 225, Issue 1 , 13-28 (2006).
- Dauce E., Quoy M., Cessac B., Doyon B. and Samuelides M., "Self-Organization and Dynamics reduction in recurrent networks: stimulus presentation and learning", Neural Networks, (11), 521-533, (1998).
- Samuelides M., Doyon B., Cessac B., Quoy M. "Spontaneous dynamics and associative learning in an asymmetric recurrent neural network", Math. of Neural Networks, 312-317, (1996).
- Cessac B., "Increase in complexity in random neural networks", J. de Physique I (France), 5, 409-432, (1995).
- Cessac B., "Occurrence of chaos and AT line in random neural networks", Europhys. Lett., 26 (8), 577-582, (1994).
- Cessac B., "Absolute Stability criteria for random asymmetric neural networks", J. of Physics A, 27, L927-L930, (1994).
- Cessac B., Doyon B., Quoy M., Samuelides M., "Mean-field equations, bifurcation map and route to chaos in discrete time neural networks", Physica D, 74, 24-44 (1994).
- Doyon B., Cessac B., Quoy M., Samuelides M., "On bifurcations and chaos in random neural networks", Acta Biotheoretica, Vol. 42, Num. 2/3, 215-225 (1994).
- Doyon B., Cessac B., Quoy M., Samuelides M. "Chaos in Neural Networks With Random Connectivity", International Journal Of Bifurcation and Chaos, Vol. 3, Num. 2, 279-291 (1993).
- Quoy M., Cessac B., Doyon B., Samuelides M. "Dynamical behaviour of neural networks with discrete time dynamics", Neural Network World, Vol. 3, Num. 6, 845-848 (1993).

Proceedings in international peer-reviewed conferences.

- Taouali W., Cessac B., "A maximum likelihood estimator of neural network synaptic weights", Twenty Second Annual Computational Neuroscience Meeting: CNS 2013, 14 (2013).
- Muratori M., Cessac B., "Beyond dynamical mean-field theory of neural networks", Twenty Second Annual Computational Neuroscience Meeting: CNS 2013, 14 (2013).
- Cofre R., Cessac B., "Dynamics and spike trains statistics in conductance-based Integrate-and-Fire neural networks with chemical and electric synapses", Twenty Second Annual Computational Neuroscience Meeting: CNS 2013, 14 (2013).

- Nasser H., Kraria S., Cessac B., "EnaS: a new software for neural population analysis in large scale spiking networks", Twenty Second Annual Computational Neuroscience Meeting: CNS 2013, 14 (2013).
- Cofré R., Cessac B., "Dynamics and spike trains statistics in conductance-based Integrate-and-Fire neural networks with chemical and electric synapses", AREADNE 2012: Encoding And Decoding of Neural Ensembles (2012).
- J.C. Vasquez, B. Cessac, and T. Viéville, "New approaches to spike train analysis and neuronal coding", CNS 2011 workshop, 27-28 July 2011.
- J.C. Vasquez, B. Cessac, and T. Viéville, "Entropy-based parametric estimation of spike train statistics", Statistical Mechanics of Learning and Inference, Stockholm-Mariehamn, May 2010.
- T. Viéville, B. Cessac, "Parametric Estimation of spike train statistics", CNS 2009, Berlin.
- H. Rostro-Gonzalez, B. Cessac, J.C. Vasquez and T. Viéville, "Back-engineering in spiking neural networks parameters", Eighteenth Annual Computational Neuroscience Meeting: CNS 2009, July 18-23, 2009, Berlin, Germany. BMC Neuroscience 2009, 10(Suppl. 10):P289, BioMed Central.
- J. C. Vasquez, B. Cessac, H. Rostro-Gonzalez, T. Viéville, "How Gibbs Distributions may naturally arise from synaptic adaptation mechanism", CNS 2009, Berlin.
- Faugeras O., Touboul J., Cessac B., “A constructive mean-field analysis of multi population neural networks with random synaptic weights”, COSYNE 09.
- Siri, B., Berry, H., Cessac, B., Delord, B. and Quoy, M., ``Local learning rules and bifurcations in the global dynamics of random recurrent neural networks''. European Conference on Complex Systems (ECCS'07), October, Dresden, Germany, (2007).
- B. Cessac, Thierry Viéville, "Revisiting time discretisation of spiking network models", Sixteenth Annual Computational Neuroscience Meeting: CNS 2007, Toronto, Canada, 7-12 July 2007. BMC Neuroscience 2007, 8(Suppl 2):P76, doi:10.1186/1471-2202-8-S2-P76.
- Siri, B., Berry, H., Cessac, B., Delord, B. and Quoy, M. ``Topological and dynamical structures induced by Hebbian learning in random neural networks''. In International Conference on Complex Systems, ICCS 2006, Boston, MA, USA, June 2006.
- B. Cessac, O. Mazet, M. Samuelides, H. Soula, "Mean field theory for random recurrent spiking neural networks", NOLTA'05 (Non Linear Theory and its Applications) October 18-21, 2005, Brugge, Belgium.
- Cessac B., Blanchard Ph., Volchenkov D., "Does renormalisation group help very much in Self-Organized criticality?", Proceedings of "The science of complexity: from mathematics to technology to a sustainable world", Bielefeld, 2002.
- Doyon B., Cessac B., Quoy M., Samuelides M., "Destabilization and route to chaos in neural networks with random connectivity", Neural Information and Processing Systems: Natural and Synthetics, (1992).

Proceedings in French conferences.

- Bruno Cessac, Hassan Nasser, Juan-Carlos Vasquez, Spike trains statistics in Integrate and Fire Models: exact result, NeuroComp2010 (Lyon).
- J.C. Vasquez, Hassan Nasser, Adrian Palacios, Bruno Cessac, Thierry Viéville and Horacio Rostro-Gonzalez, "Parametric estimation of Spike train statistics by Gibbs distributions: an application to bio-inspired and experimental data", NeuroComp 2010 (Lyon).
- B. Cessac, H. Rostro, J.C. Vasquez, T. Viéville, "Statistics of spikes trains, synaptic plasticity and Gibbs distributions", proceedings of the conference NeuroComp 2008 (Marseille).
- B. Cessac, H. Rostro, J.C. Vasquez, T. Viéville, "To which extent is the ``neural code'' a metric?", proceedings of the conference NeuroComp 2008 (Marseille).
- Siri, B., Berry, H., Cessac, B., Delord, B., Quoy, M., and Temam, O. , ``Learning-induced topological effects on dynamics in neural networks''. In NeuroComp'06:206--209, Pont-à-Mousson, France, 23-24 October 2006.
- Cessac B., "Some fractal aspects of Self-Organized Criticality", Proceedings of the colloquium "Fractales en progrès", in honour of Benoit Mandelbrot's 80th birthday, 12-13 November 2004.
- Cessac B., Blanchard Ph., Krüger T., "A dynamical system approach to Self-Organized Criticality", Proceedings of the International Conference on Mathematical Results in Statistical Mechanics, 27-31 July 1998, Marseille.

Book chapters.

- B. CESSAC, A. PALACIOS, "Spike train statistics from empirical facts to theory: the case of the retina", in "Modeling in Computational Biology and Biomedicine: A Multidisciplinary Endeavor", F. CAZALS, P. KORNPROBST (editors), Lecture Notes in Mathematical and Computational Biology (LNMCB), Springer-Verlag, 2012.

Popularization.

- B. Cessac, H. Berry, "Du chaos dans les neurones", Pour la Science, November 2009.

- B. Cessac, T. Viéville, C. Leininger, "Le cerveau est-il un bon modèle de réseau de neurones ?" , "Interstices", 11-07, (2007).