One step towards an abstract view of computation in spiking neural networks.
Neural network information is mainly conveyed through (i) event-based quanta, spikes, whereas high-level representations of
the related processing are almost always
modeled in (ii) some continuous framework. Here, we propose a link between (i) and (ii) that allows us to derive the spiking
network parameters from a given continuous
processing, and also to obtain an abstract interpretation of the related processing.
In event-based neural network models, the output of a neuron is entirely characterized by its sequence of spike firing times,
and
the Spike Response Model of Gerstner and Kistler defines the state of a biological neuron via a single variable.
At the computational level, using piece-wise linear response profiles yields a closed-form calculation of the spiking events,
thus allowing an efficient
and exact implementation of this model in event-based massive neuronal simulators such as MVASPIKE.
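For reference, a minimal form of the Spike Response Model, following Gerstner and Kistler (the notation below is ours, not the paper's), writes the neuron state as a single variable driven by response kernels, with a spike emitted at each threshold crossing:

```latex
% Minimal SRM sketch (assumed notation):
% u_i : state of neuron i, \hat{t}_i : its last firing time,
% \eta : refractory kernel, \epsilon : post-synaptic response profile,
% w_{ij} : synaptic weights, \theta : firing threshold.
u_i(t) = \eta\big(t - \hat{t}_i\big)
       + \sum_{j} w_{ij} \sum_{f} \epsilon\big(t - t_j^{(f)}\big),
\qquad \text{spike at } t \text{ when } u_i(t) \ge \theta .
```

With piece-wise linear kernels \eta and \epsilon, the state u_i is itself piece-wise linear in t, so each threshold crossing solves a linear equation exactly; this is what makes the event-based simulation both exact and efficient.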
Using this model and following Maass and Natschläger, we represent the signal as the last spike delay with respect to a given
temporal reference,
consider piece-wise linear response profiles (as approximations of Hodgkin-Huxley related profiles), introduce a temporal
discretization of the input current,
and obtain a direct link with the continuous representation of neural map computation:
- the resistive coefficient being proportional to the spiking threshold
- the diffusion term of the variational approach being in direct relation with the synaptic weights
- the corrective term being controlled by the axonal delay
- the input gain being controlled by the input resistance
with a closed-form correspondence allowing one to explicitly calculate the neural network parameters given an abstract continuous
representation (a schematic form of this mapping is sketched below).
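To make the correspondence concrete, here is a schematic discretized leaky-diffusion form of the continuous map dynamics; the symbols are our own choices for illustration, not taken from the paper:

```latex
% Schematic continuous map dynamics (assumed notation):
% \lambda : resistive (leak) coefficient, \sigma_{ij} : diffusion weights,
% d : axonal delay, \gamma : input gain, I_i : input current.
\tau \, \frac{dV_i}{dt} \;=\; -\,\lambda\, V_i(t)
  \;+\; \sum_{j} \sigma_{ij}\,\big( V_j(t - d) - V_i(t) \big)
  \;+\; \gamma\, I_i(t)
```

Under the correspondence listed above, \lambda relates to the spiking threshold, \sigma_{ij} to the synaptic weights, the delayed term carries the corrective contribution controlled by d, and \gamma is set by the input resistance.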
This relationship is valid only in a given temporal window, with saturation outside it, as for analog networks. Here it appears
that fast adaptive delays (as observed
in recent intra-cellular experiments of, e.g., Fregnac et al.) are a crucial element in this model.
The derivation involves a constraint coherent with S.T.D.P. adaptation rules, which yield the same constraint, as derived by, e.g.,
Guyonneau.
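For context, the canonical S.T.D.P. temporal window (the standard textbook form, not the specific constraint derived here) modifies a weight according to the relative timing \Delta t = t_{\text{post}} - t_{\text{pre}} of pre- and post-synaptic spikes:

```latex
% Canonical additive STDP window (standard form, for context only):
\Delta w_{ij} =
\begin{cases}
  +A_{+}\, e^{-\Delta t / \tau_{+}}, & \Delta t > 0 \quad \text{(pre before post: potentiation)}\\[2pt]
  -A_{-}\, e^{+\Delta t / \tau_{-}}, & \Delta t < 0 \quad \text{(post before pre: depression)}
\end{cases}
```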
It also corresponds to what is obtained from a variational framework relating the neuronal weights to a continuous diffusion
operator,
as introduced by Cottet. This last formulation is in direct relation with a sub-class of Cohen-Grossberg dynamical systems.
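For context, the Cohen-Grossberg family of dynamical systems (in its standard form from the literature, not restated in this text) reads:

```latex
% Standard Cohen-Grossberg dynamics:
% a_i : amplification, b_i : self-signal, c_{ij} : interaction
% coefficients, d_j : activation functions.
\frac{dx_i}{dt} = a_i(x_i)\left[\, b_i(x_i) - \sum_{j=1}^{n} c_{ij}\, d_j(x_j) \,\right]
```

The diffusion-related sub-class mentioned here plausibly corresponds to interaction coefficients c_{ij} forming a discrete Laplacian stencil; this reading is our interpretation.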
We illustrate the previous derivation with an event-based implementation of an early-vision processing layer,
for a 1D spiking neural network, corresponding to an edge-preserving smoothing of the input using a non-linear diffusion operator (an illustrative continuous counterpart is sketched below).
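To illustrate the kind of processing meant, here is a minimal continuous-domain sketch of 1D edge-preserving non-linear diffusion (Perona-Malik style). It is an assumed counterpart of the processing, not the authors' event-based spiking implementation; the conductance function and all parameters below are our choices.

```python
import numpy as np

def edge_preserving_smoothing_1d(signal, n_steps=200, dt=0.1, kappa=0.1):
    """Illustrative 1D non-linear diffusion (Perona-Malik style sketch).

    Smoothing is inhibited across strong gradients (edges): the
    conductance g drops where |grad| is large relative to kappa.
    All parameters are assumptions, not taken from the paper.
    """
    u = np.asarray(signal, dtype=float).copy()
    for _ in range(n_steps):
        grad = np.diff(u)                       # forward differences u[i+1]-u[i]
        g = 1.0 / (1.0 + (grad / kappa) ** 2)   # small conductance at edges
        flux = g * grad
        # Divergence of the flux, with zero-flux boundary conditions.
        u[1:-1] += dt * (flux[1:] - flux[:-1])
        u[0] += dt * flux[0]
        u[-1] -= dt * flux[-1]
    return u

# Example: a noisy step edge is smoothed while the edge itself is preserved.
x = np.linspace(0.0, 1.0, 128)
noisy_step = (x > 0.5).astype(float) + 0.05 * np.random.randn(128)
smoothed = edge_preserving_smoothing_1d(noisy_step)
```

The explicit time stepping above mirrors the temporal discretization of the input current mentioned earlier; in the event-based network the analogous computation would be carried by spikes rather than by a fixed time grid.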
Keywords: Cortical maps; Diffusion operator; Visual function; Spiking neural networks.
Co-authors: Léonard Gérard, Pierre Kornprobst, Olivier Rochel, et al.