Papers:
- Ideomotor feedback control in a recurrent neural network.
Mathieu N. Galtier, 2014.
Submitted. Link to preprint.
- A local Echo State Property through the largest Lyapunov Exponent.
Mathieu N. Galtier and Gilles Wainrib, 2014.
Submitted. Link to preprint.
- Regular graphs increase variability in neural networks.
Gilles Wainrib and Mathieu N. Galtier, 2014.
Submitted. Link to preprint.
- Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes.
Mathieu N. Galtier, Camille Marini, Gilles Wainrib and Herbert Jaeger, 2014.
Accepted for publication in Neural Networks. Link to preprint.
- Macroscopic Equations Governing Noisy Spiking Neuronal Populations with Linear Synapses.
Mathieu N. Galtier and Jonathan Touboul, 2013.
PLoS ONE, 8(11), e78917. Link to paper.
- A Biological Gradient Descent for Prediction Through a Combination of STDP and Homeostatic Plasticity.
Mathieu N. Galtier and Gilles Wainrib, 2013.
Neural Computation 25(11), 2815-2832. Link to preprint.
- Multi-scale analysis of slow-fast neuronal learning models with noise.
Mathieu N. Galtier and Gilles Wainrib, 2013.
Journal of Mathematical Neuroscience, 2, 13. Link to paper.
- Hebbian learning of recurrent connections: a geometrical perspective.
Mathieu N. Galtier, Olivier D. Faugeras and Paul C. Bressloff, 2012.
Neural Computation, 24(9), 2346-2383. Link to preprint.
- On an explicit representation of the solution of linear stochastic partial differential equations with delays.
Mathieu N. Galtier and Jonathan D. Touboul, 2012.
C. R. Acad. Sci. Paris, Ser. I. Link to preprint.
PhD thesis:
- A mathematical approach to unsupervised learning in recurrent neural networks, 2012. Link to manuscript.