Wouter M. Koolen | MetaGrad: Multiple Learning Rates in Online Learning
- https://wsc.project.cwi.nl/ml-reading-group/events/wouter-m-koolen-metagrad-multiple-learning-rates-in-online-learning
- This is a NIPS oral sneak preview / practice talk. It will follow the NIPS format of a 15-minute presentation plus 5 minutes of questions. Your feedback will be very welcome.
- When Nov 28, 2016 from 02:00 PM to 03:00 PM (Europe/Amsterdam / UTC+01:00)
- Where L016
In online convex optimization it is well known that certain subclasses of objective functions are much easier than arbitrary convex functions. We are interested in designing adaptive methods that can automatically get fast rates in as many such subclasses as possible, without any manual tuning. Previous adaptive methods are able to interpolate between strongly convex and general convex functions. We present a new method, MetaGrad, that adapts to a much broader class of functions, including exp-concave and strongly convex functions, but also various types of stochastic and non-stochastic functions without any curvature. For instance, MetaGrad can achieve logarithmic regret on the unregularized hinge loss, even though it has no curvature, if the data come from a favourable probability distribution. MetaGrad's main feature is that it simultaneously considers multiple learning rates. Unlike all previous methods with provable regret guarantees, however, its learning rates are not monotonically decreasing over time and are not tuned based on a theoretically derived bound on the regret. Instead, they are weighted in direct proportion to their empirical performance on the data, using a tilted exponential weights master algorithm.
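To make the abstract's central mechanism concrete, here is a minimal Python sketch of the "multiple learning rates combined by a tilted exponential weights master" idea. It is not the actual MetaGrad algorithm: in the paper each learning rate runs an Online Newton Step slave on a quadratic surrogate loss, whereas here each slave is simplified to plain online gradient descent with its own fixed step size. The class name `TiltedEWMaster`, the grid constants, and the toy hinge-loss experiment are illustrative assumptions; only the overall structure (an exponentially spaced grid of learning rates, weights tilted by `exp(eta*r - (eta*r)^2)`, and an eta-weighted combination of slave iterates) follows the description above.

```python
# Minimal sketch of a tilted-exponential-weights master over multiple
# learning rates, in the spirit of MetaGrad. NOT the paper's algorithm:
# real MetaGrad slaves run Online Newton Step on a quadratic surrogate;
# the OGD slaves and all constants below are simplifying assumptions.

import numpy as np

class TiltedEWMaster:
    def __init__(self, dim, diameter=1.0, grad_bound=1.0, horizon=1000):
        # Exponentially spaced grid of roughly (1/2) log2(horizon)
        # learning rates (grid constants are illustrative).
        k = max(1, int(np.ceil(0.5 * np.log2(horizon))))
        eta_max = 1.0 / (5.0 * diameter * grad_bound)
        self.etas = eta_max * 2.0 ** (-np.arange(k))
        self.pi = np.full(k, 1.0 / k)   # master weights over slaves
        self.W = np.zeros((k, dim))     # one iterate per learning rate
        self.diameter = diameter

    def predict(self):
        # Combine slave iterates, weighting each by pi(eta) * eta.
        z = self.pi * self.etas
        return (z[:, None] * self.W).sum(axis=0) / z.sum()

    def update(self, grad, w_master):
        # r_eta = g^T (w_master - w_eta): the empirical advantage of
        # slave eta over the master on this round's linearized loss.
        r = (w_master - self.W) @ grad
        # Tilted exponential weights: reward empirical performance,
        # penalize its square (the "tilt").
        self.pi *= np.exp(self.etas * r - (self.etas * r) ** 2)
        self.pi /= self.pi.sum()
        # Simplified slaves: an OGD step with each slave's own learning
        # rate, projected back onto a Euclidean ball.
        self.W -= self.etas[:, None] * grad
        norms = np.linalg.norm(self.W, axis=1, keepdims=True)
        radius = self.diameter / 2.0
        self.W *= np.minimum(1.0, radius / np.maximum(norms, 1e-12))

# Toy usage: unregularized hinge loss, the abstract's example of a
# curvature-free loss on which adaptivity can still pay off.
rng = np.random.default_rng(0)
dim, T = 5, 1000
u_star = rng.normal(size=dim); u_star /= np.linalg.norm(u_star)
learner = TiltedEWMaster(dim, diameter=2.0, grad_bound=3.0, horizon=T)
total = 0.0
for t in range(T):
    x = rng.normal(size=dim)
    y = np.sign(x @ u_star)
    w = learner.predict()
    loss = max(0.0, 1.0 - y * (w @ x))
    total += loss
    g = -y * x if loss > 0 else np.zeros(dim)  # hinge subgradient
    learner.update(g, w)
print(f"average hinge loss over {T} rounds: {total / T:.3f}")
```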