Neuromorphic Algorithms

Learning algorithms inspired by the brain rely on local computations that stem from synaptic and neural dynamics. These methods are better suited to physical artificial neural networks than traditional mathematical techniques, such as gradient backpropagation, commonly used in software networks. However, they struggle to achieve high accuracy on complex tasks.

We are devising new learning algorithms that aim to combine the accuracy of backpropagation with the hardware compatibility of brain-inspired techniques. These algorithms leverage physical phenomena, such as relaxation towards a minimum-energy state or transient dynamics, to recognize and classify data in either supervised or unsupervised settings.
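As a toy illustration of how relaxation towards a minimum-energy state can drive learning, here is a minimal NumPy sketch of Equilibrium Propagation (the algorithm studied in the publications below) on a small continuous Hopfield-style network: the state settles freely, is then weakly nudged towards the target, and each weight changes using only the activities of the two neurons it connects. The layer sizes, hard-sigmoid activation, step sizes, and two-pattern task are illustrative assumptions, not the exact setups used in the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 16, 2
W1 = rng.normal(0.0, 0.5, (n_in, n_hid))   # symmetric input-hidden couplings
W2 = rng.normal(0.0, 0.5, (n_hid, n_out))  # symmetric hidden-output couplings

def rho(u):
    """Hard-sigmoid activation."""
    return np.clip(u, 0.0, 1.0)

def drho(u):
    """Its derivative: 1 inside the linear region, 0 outside."""
    return ((u >= 0.0) & (u <= 1.0)).astype(float)

def relax(x, h, y, beta=0.0, target=None, steps=60, dt=0.5):
    """Gradient descent on the energy
    E = ||h||^2/2 + ||y||^2/2 - rho(x) W1 rho(h) - rho(h) W2 rho(y),
    with an extra nudging force beta*(target - y) in the clamped phase."""
    for _ in range(steps):
        dh = -h + drho(h) * (rho(x) @ W1 + rho(y) @ W2.T)
        dy = -y + drho(y) * (rho(h) @ W2)
        if beta > 0.0:
            dy = dy + beta * (target - y)   # weakly clamp the outputs
        h = h + dt * dh
        y = y + dt * dy
    return h, y

def train_step(x, target, beta=0.5, lr=0.05):
    """One EP update: free phase, nudged phase, then local contrastive
    weight changes built only from pre- and post-synaptic activities."""
    global W1, W2
    h = rng.uniform(0.0, 1.0, n_hid)
    y = rng.uniform(0.0, 1.0, n_out)
    h0, y0 = relax(x, h, y)                                # free phase
    hb, yb = relax(x, h0.copy(), y0.copy(), beta, target)  # nudged phase
    W1 += lr / beta * np.outer(rho(x), rho(hb) - rho(h0))
    W2 += lr / beta * (np.outer(rho(hb), rho(yb)) - np.outer(rho(h0), rho(y0)))

# Toy task: associate two binary patterns with one-hot labels
X = np.array([[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]])
T = np.array([[1.0, 0.0], [0.0, 1.0]])
for _ in range(200):
    for x, t in zip(X, T):
        train_step(x, t)
```

Note that no error signal is transported backwards through the network: the second relaxation physically propagates the nudge, which is what makes the scheme attractive for hardware.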

The hardware neural networks we develop, based on spintronics, oxides, or quantum mechanics, serve as natural platforms to implement these algorithms through the physics and dynamics of their components. While maintaining algorithmic precision on complex tasks is challenging, given the noisy and variable characteristics of nano-components, we believe it is achievable: after all, the brain does it. Inspired by this biological feat, we seek solutions that combine high performance, fault resilience, and low energy consumption.
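One hypothetical way to probe such fault resilience in simulation, reusing rho, drho, W1, W2, and rng from the sketch above, is to inject perturbations into the relaxation itself, standing in for fluctuating nano-devices; the Gaussian noise model and its amplitude sigma are arbitrary assumptions, not measured device figures.

```python
def relax_noisy(x, h, y, sigma=0.02, beta=0.0, target=None, steps=60, dt=0.5):
    """Same energy descent as relax(), with additive state noise
    emulating fluctuating nanoscale components (sigma is illustrative)."""
    for _ in range(steps):
        dh = -h + drho(h) * (rho(x) @ W1 + rho(y) @ W2.T)
        dy = -y + drho(y) * (rho(h) @ W2)
        if beta > 0.0:
            dy = dy + beta * (target - y)
        h = h + dt * dh + sigma * rng.normal(size=h.shape)
        y = y + dt * dy + sigma * rng.normal(size=y.shape)
    return h, y
```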

Key results and publications:

E. Martin, M. Ernoult, J. Laydevant, S. Li, D. Querlioz, T. Petrisor, J. Grollier, EqSpike: Spike-driven equilibrium propagation for neuromorphic implementations, iScience 24, 3 (2021)

J. Laydevant, M. Ernoult, D. Querlioz, J. Grollier, Training Dynamical Binary Neural Networks with Equilibrium Propagation, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (2021)

M. Ernoult, J. Grollier, D. Querlioz, Y. Bengio, B. Scellier, Equilibrium Propagation with Continual Weight Updates, arXiv preprint (2020)