Tuesday, 12th of March

14h30 (room R2014, 660 building)

Alexis Dubreuil

(Group for Neural Theory, ENS)

Title: Reverse-engineering of low-rank recurrent neural networks.


Recurrent neural networks (RNNs) are used in a variety of machine learning tasks, such as language modeling or handling partial observability in reinforcement learning problems. In these approaches RNNs are used as black boxes, and an important challenge is to understand how these machines process their inputs to deliver task-relevant outputs. This is a difficult problem, as these machines are high-dimensional non-linear systems. To overcome this difficulty, we focus on RNNs whose recurrent connectivity matrix is low-rank, which allows us to use recent theoretical results describing the dynamical properties of networks of this type. I will present three example tasks, inspired by neuroscience experiments, on which we train and reverse-engineer low-rank RNNs, and from which we propose mechanisms for the implementation of working memory, estimation of noisy inputs, and context-dependent rule switching. I will also briefly mention some work towards showing that this approach can be useful for understanding the behavior of generic (non-low-rank) trained RNNs.
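To make the low-rank setting concrete, here is a minimal sketch (an illustration under assumed conventions, not the speaker's actual model or code) of a rank-one RNN: the recurrent matrix is an outer product J = (1/N) m nᵀ, so the recurrent input to every unit is driven by a single latent variable, which is what makes such networks tractable to reverse-engineer.

```python
import numpy as np

# Hypothetical rank-one RNN sketch: discrete-time Euler integration of
#   dx/dt = -x + J * tanh(x),  with  J = (1/N) * m n^T  (rank one).
# All parameter choices (N, dt, steps, Gaussian vectors) are assumptions
# for illustration only.
rng = np.random.default_rng(0)
N = 200
m = rng.normal(size=N)          # output direction of the connectivity
n = rng.normal(size=N)          # input-selection direction
J = np.outer(m, n) / N          # rank-one recurrent matrix

def simulate(x0, steps=500, dt=0.1):
    """Integrate the rate dynamics with a simple Euler scheme."""
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (-x + J @ np.tanh(x))
    return x

x_final = simulate(rng.normal(size=N))

# Because J is rank one, the recurrent drive J @ tanh(x) always points
# along m, scaled by a single scalar latent variable:
kappa = n @ np.tanh(x_final) / N
print("rank of J:", np.linalg.matrix_rank(J), "| latent kappa:", kappa)
```

The point of the sketch is the dimensionality collapse: however large N is, the recurrent dynamics are summarized by the scalar kappa (one scalar per rank of J in general), which is the kind of low-dimensional description the reverse-engineering approach exploits.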

Contact: guillaume.charpiat at inria.fr
