
This event has already taken place.

Talk as part of the Doctoral Seminar on the topic: Acceleration in Stochastic Optimisation

Event category
Talk

Venue
I.2.35

Organiser
Institut für Mathematik


Description

Stochastic gradient descent (SGD) is the engine beneath the hood of many
machine learning algorithms, but research into its acceleration is still in its
infancy. It has long been known that Nesterov’s accelerated gradient descent
achieves the optimal convergence rate in the deterministic setting, but such
methods are difficult to apply to stochastic gradient descent. In this talk,
we will discuss existing accelerated SGD algorithms, tricks to achieve the
optimal convergence rate, and ongoing research into acceleration methods
for SGD.
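
As background for the abstract, the sketch below shows one common way momentum is combined with stochastic gradients: SGD with a Nesterov-style look-ahead step on a toy least-squares problem. It is an illustrative assumption for orientation only, not the speaker's algorithm; the problem setup, hyperparameters, and variable names are all invented for the example.

import numpy as np

# Toy least-squares problem: minimise 0.5/n * ||Ax - b||^2 (illustrative setup).
rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

def stochastic_grad(x, batch):
    """Unbiased mini-batch estimate of the full gradient (1/n) * A^T (Ax - b)."""
    Ab = A[batch]
    return Ab.T @ (Ab @ x - b[batch]) / len(batch)

x = np.zeros(d)           # iterate
v = np.zeros(d)           # momentum buffer
lr, momentum = 0.05, 0.9  # illustrative hyperparameters, not tuned

for t in range(2000):
    batch = rng.choice(n, size=32, replace=False)
    # Nesterov look-ahead: evaluate the stochastic gradient at the
    # extrapolated point x + momentum * v rather than at x itself.
    g = stochastic_grad(x + momentum * v, batch)
    v = momentum * v - lr * g
    x = x + v

print("distance to x_true:", np.linalg.norm(x - x_true))

In the deterministic (full-gradient) setting this look-ahead is what distinguishes Nesterov's method from plain heavy-ball momentum; with noisy mini-batch gradients it no longer guarantees the accelerated rate, which is the difficulty the abstract alludes to.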

Speaker
Derek Driggs (University of Cambridge)

Contact
Senka Haznadar (senka [dot] haznadar [at] aau [dot] at)