Dear all,
This Friday we have Tomer Koren from Tel Aviv University in the thematic seminar:
*Tomer Koren* (Tel Aviv University, https://tomerkoren.github.io/)
*Friday March 11*, 16h00-17h00
Online on Zoom: https://uva-live.zoom.us/j/85690850169
Meeting ID: 856 9085 0169
Please also join for online drinks after the talk.

*Benign Underfitting of Stochastic Gradient Descent*
We study to what extent stochastic gradient descent (SGD) may be understood as a "conventional" learning rule that achieves generalization performance by obtaining a good fit to training data. We consider the fundamental stochastic convex optimization framework, where SGD is classically known to minimize the population risk at an optimal rate, and prove that, surprisingly, there exist problem instances where the SGD solution exhibits both an empirical risk and a generalization gap lower bounded by a universal constant. Consequently, it turns out that SGD is not algorithmically stable in any sense, and its generalization ability cannot be explained by uniform convergence or, for that matter, any other currently known generalization-bound technique (other than that of its classical analysis). Time permitting, we will discuss related results for with-replacement SGD, multi-epoch SGD, and full-batch gradient descent.
Based on joint works with Idan Amir, Roi Livni, Yishay Mansour and Uri Sherman.
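For readers who want to see the quantities from the abstract concretely, here is a minimal sketch (not from the talk): one pass of without-replacement SGD on a toy least-squares instance, reporting the empirical risk of the returned average iterate and a fresh-sample estimate of its population risk. The problem instance, step size, and all names are illustrative assumptions, not the construction from the paper.

```python
# Minimal sketch: one-pass SGD on a toy stochastic convex problem,
# illustrating empirical risk F_hat(w) vs. an estimate of population
# risk F(w). The instance is deliberately benign (least squares).
import numpy as np

rng = np.random.default_rng(0)
d, n = 20, 1000                      # dimension, number of training samples
w_star = np.ones(d) / np.sqrt(d)     # assumed population minimizer

def sample(size):
    """Draw (x, y) with y = <w*, x> + noise; loss is 0.5*(<w, x> - y)^2."""
    X = rng.standard_normal((size, d)) / np.sqrt(d)
    y = X @ w_star + 0.1 * rng.standard_normal(size)
    return X, y

def risk(w, X, y):
    """Average squared loss of w on the sample (X, y)."""
    return 0.5 * np.mean((X @ w - y) ** 2)

X_train, y_train = sample(n)

# One-pass (without-replacement) SGD: each training example is used once,
# with step size eta ~ 1/sqrt(n); return the average iterate.
w = np.zeros(d)
avg = np.zeros(d)
eta = 1.0 / np.sqrt(n)
for t in range(n):
    x_t, y_t = X_train[t], y_train[t]
    grad = (x_t @ w - y_t) * x_t     # gradient of 0.5*(<w, x_t> - y_t)^2
    w -= eta * grad
    avg += w / n

X_test, y_test = sample(10_000)      # fresh samples estimate population risk
print("empirical risk:         ", risk(avg, X_train, y_train))
print("population risk (approx):", risk(avg, X_test, y_test))
```

On this easy instance SGD both fits the training data and generalizes; the point of the talk is that there exist stochastic convex problem instances where the empirical risk of the SGD solution stays bounded below by a universal constant even though its population risk is optimal.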
Seminar organizers: Tim van Erven and Botond Szabo
https://mschauer.github.io/StructuresSeminar/
*Upcoming talks:*
Mar. 25: Nicolò Cesa-Bianchi (Università degli Studi di Milano), https://mschauer.github.io/StructuresSeminar/#CesaBianchi
Apr. 22: Tor Lattimore (DeepMind), https://mschauer.github.io/StructuresSeminar/#Lattimore
Jun. 10: Julia Olkhovskaya (Vrije Universiteit), https://sites.google.com/view/julia-olkhovskaya/home
Hi everyone,
Reminder: today's seminar is starting in a few minutes.
Best regards,
Tim