Tuesday, 12th of March

14h30 (room R2014, 660 building) (see location)

Alexis Dubreuil

(Group for Neural Theory, ENS)

Title: Reverse-engineering of low-rank recurrent neural networks.


Abstract

Recurrent neural networks (RNNs) are used in a variety of machine learning tasks, such as language modeling or handling partial observability in reinforcement learning problems. In these approaches RNNs are used as black boxes, and an important challenge is to understand how these machines process their inputs to deliver task-relevant outputs. This is a difficult problem, as these machines are high-dimensional non-linear systems. To overcome this difficulty, we focus on RNNs whose recurrent matrix is low-rank, which allows us to use recent theoretical results describing the dynamical properties of this type of network. I will present three example tasks, inspired by neuroscience experiments, on which we train and reverse-engineer low-rank RNNs, and which allow us to propose mechanisms for the implementation of working memory, estimation of noisy inputs, and context-dependent rule switching. I will also briefly mention some work towards showing that this approach can be useful for understanding the behavior of generic (non-low-rank) trained RNNs.
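To make the low-rank constraint concrete, here is a minimal sketch of a rank-R rate network whose recurrent matrix is an outer product of two N×R matrices. This is only an illustration under assumed choices (network size, rank, tanh nonlinearity, Euler integration, scalar input and readout); it is not the speaker's actual implementation.

```python
# Minimal sketch of a rank-R recurrent network update (illustrative only;
# sizes, rank, nonlinearity and input handling are assumptions).
import numpy as np

N, R, T = 512, 2, 200          # neurons, rank, time steps (hypothetical values)
dt, tau = 0.1, 1.0             # Euler step and neuronal time constant

rng = np.random.default_rng(0)
m = rng.standard_normal((N, R))        # left connectivity vectors
n = rng.standard_normal((N, R))        # right connectivity vectors
J = m @ n.T / N                        # rank-R recurrent matrix
w_in = rng.standard_normal(N)          # input weights
w_out = rng.standard_normal(N) / N     # readout weights

x = np.zeros(N)                        # network state
u = rng.standard_normal(T)             # some scalar input stream
outputs = np.empty(T)
for t in range(T):
    r = np.tanh(x)                                   # firing rates
    x = x + dt / tau * (-x + J @ r + w_in * u[t])    # low-rank RNN dynamics
    outputs[t] = w_out @ r                           # scalar readout
```

Because J has rank R, the recurrent drive J @ r is confined to the R-dimensional subspace spanned by the columns of m, which is what makes the dynamics of such trained networks tractable to reverse-engineer.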



Contact: guillaume.charpiat at inria.fr
All TAU seminars: here
