Wouter Koolen | MetaGrad: Online Convex Optimization in Individual-Sequence and Stochastic Settings
This talk will be about MetaGrad, a new algorithm for Online Convex Optimization.
- https://wsc.project.cwi.nl/ml-reading-group/events/wouter-koolen-metagrad-online-convex-optimization-in-individual-sequence-and-stochastic-settings
- When Jun 16, 2016 from 11:00 AM to 01:00 PM (Europe/Amsterdam / UTC+200)
- Where L236
We will start by reviewing the Online Convex Optimization problem, the Online Gradient Descent strategy, and its performance guarantees. We will then look at the design of a new algorithm, called MetaGrad, which is based on aggregating multiple learning rates. We will show that MetaGrad has individual-sequence regret guarantees under which it can often do better than the worst-case lower bound. Finally, we will show that in stochastic settings these regret guarantees yield adaptivity to a friendliness parameter of the generating distribution (the Bernstein exponent).
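Since the abstract reviews Online Gradient Descent and then MetaGrad's aggregation of multiple learning rates, here is a rough Python illustration of both ideas. It is a simplified sketch, not the algorithm from the paper: the learning-rate grid, the bounds G and D, and especially the plain gradient-descent slaves are assumptions made to keep the example short (the paper's slaves use a full-matrix, Online-Newton-Step-style update).

```python
import numpy as np

def online_gradient_descent(grad_fn, x0, eta, T, project=lambda x: x):
    """Plain Online Gradient Descent with a single fixed learning rate eta.

    grad_fn(t, x) returns a (sub)gradient of the round-t loss at x;
    project maps a point back onto the feasible convex set.
    """
    x = np.asarray(x0, dtype=float)
    for t in range(T):
        g = grad_fn(t, x)
        x = project(x - eta * g)   # gradient step, then projection
    return x

def metagrad_sketch(grad_fn, x0, T, project=lambda x: x, G=1.0, D=1.0):
    """Simplified MetaGrad-flavoured learner (illustration only).

    One "slave" per learning rate on an exponentially spaced grid, plus a
    master that aggregates the slaves with exponential weights on the
    quadratic surrogate losses eta*r + (eta*r)^2, where
    r_i = g . (x_i - x_master).  The slaves below take plain gradient
    steps; the actual algorithm in the paper uses a full-matrix update.
    """
    n = int(np.ceil(np.log2(max(T, 2)))) + 1
    etas = np.array([1.0 / (5.0 * G * D * 2.0 ** i) for i in range(n)])
    slaves = np.tile(np.asarray(x0, dtype=float), (n, 1))
    weights = np.full(n, 1.0 / n)
    for t in range(T):
        tilt = weights * etas                      # eta-tilted master weights
        x = tilt @ slaves / tilt.sum()             # master prediction
        g = grad_fn(t, x)
        r = slaves @ g - x @ g                     # r_i = g . (x_i - x)
        weights *= np.exp(-(etas * r + (etas * r) ** 2))
        weights /= weights.sum()
        # Each slave takes a gradient step with its own learning rate.
        slaves = np.array([project(xi - eta * g)
                           for xi, eta in zip(slaves, etas)])
    return x
```

For a quick test, grad_fn(t, x) could ignore t and return the gradient of a fixed convex loss, e.g. 2.0 * (x - 1.0) for the squared distance to 1.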
MetaGrad paper: http://arxiv.org/abs/arXiv:1604.08740
Stochastic fast rates paper: http://arxiv.org/abs/arXiv:1605.06439