
Working Group on Risk

On the global convergence of gradient descent for non-convex machine learning problems


This talk will be given by Dr. Francis Bach of INRIA.


Abstract

Many tasks in machine learning and signal processing can be solved by minimizing a convex function of a measure. This includes sparse spikes deconvolution or training a neural network with a single hidden layer. For these problems, we study a simple minimization method: the unknown measure is discretized into a mixture of particles and a continuous-time gradient descent is performed on their weights and positions. This is an idealization of the usual way to train neural networks with a large hidden layer. We show that, when initialized correctly and in the many-particle limit, this gradient flow, although non-convex, converges to global minimizers. The proof involves Wasserstein gradient flows, a by-product of optimal transport theory. Numerical experiments show that this asymptotic behavior is already at play for a reasonable number of particles, even in high dimension. (Joint work with Lénaïc Chizat)
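To make the method in the abstract concrete, here is a minimal sketch (not the speakers' code) of particle gradient descent for a single-hidden-layer network: the unknown measure is discretized into m particles, one per hidden unit, and gradient descent is run jointly on each particle's output weight and position (its input-weight vector). The toy regression data, ReLU features, step size, and particle count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, m = 200, 5, 512                      # samples, input dimension, number of particles
X = rng.standard_normal((n, d))
y = np.tanh(X @ rng.standard_normal(d))    # toy regression target

A = rng.standard_normal((m, d))            # particle positions (input weights), one per row
w = rng.standard_normal(m)                 # particle weights (output weights)

def predict(A, w, X):
    # Mean-field scaling: the prediction is an average over the m particles.
    return np.maximum(X @ A.T, 0.0) @ w / m

lr, steps = 0.1, 3000
for _ in range(steps):
    H = np.maximum(X @ A.T, 0.0)           # (n, m) hidden activations
    residual = H @ w / m - y               # (n,) prediction error
    # Gradients of the mean-squared error with respect to weights and positions.
    grad_w = H.T @ residual / (n * m)
    grad_A = ((residual[:, None] * (H > 0)) * w).T @ X / (n * m)
    # The factor m rescales the step so each particle moves at an O(1) rate,
    # a discrete-time stand-in for the gradient flow on the underlying measure.
    w -= lr * m * grad_w
    A -= lr * m * grad_A

print("final MSE:", float(np.mean((predict(A, w, X) - y) ** 2)))
```

In the abstract's terminology, letting m grow corresponds to the many-particle limit in which this non-convex flow is shown to converge to global minimizers.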

Monday, June 17, 2019, 12:30 pm - 1:30 pm
ESSEC
La Défense campus (CNIT) - Room 202
La Défense
  • Free
