CS 281B / Stat 241B, Spring 2014:

Statistical Learning Theory


Office hours
Instructor: Peter Bartlett, bartlett at cs. Mon 11:00-12:00, Thu 2:00-3:00; Evans 399.
Instructor: Wouter Koolen, wmkoolen at gmail.com. Just drop by (or schedule an appointment); Evans 418.
GSI: Alan Malek, malek at berkeley. Mon 5:00-6:00, Fri 4:00-5:00; Soda 411.

Lectures: Tuesday/Thursday 12:30-2:00, Evans 334.

Course description

This course will provide an introduction to the theoretical analysis of prediction methods, focusing on statistical and computational aspects. It will cover approaches such as kernel methods and boosting algorithms, as well as probabilistic and game-theoretic formulations of prediction problems, with an emphasis on tools for analyzing the performance of learning algorithms and the inherent difficulty of learning problems.
Prerequisites: CS281A/Stat241A, or advanced training in probability or statistics, at the level of Stat 205A or Stat 210A.


Grading

The grade will be based 50% on homework and 50% on the final project.

There will be five homework assignments, approximately one every two weeks. Late homework will not be accepted. The homework grade will be based on the best four of the five assignments. You are welcome to discuss the homework with other students, but please work out and write up the solutions completely on your own, and indicate in your solutions which problems you discussed and with whom. Some of the problems have appeared in the literature. Please attempt them yourself, and if you need help, ask the instructor or GSI for assistance rather than searching for someone else's solution. If you happen to have seen a problem before, please write up a solution on your own (and indicate that you've seen it before; it would also be helpful to point out where).

There will be a final project. This can be in any area related to the topics of the course. You might extend a theoretical result, develop a new method and investigate its performance, run experiments on an existing method for a particular application, or do a combination of these. You will need to submit a brief written report and give a presentation in class on Thursday, May 1. It is OK to work on projects in groups of two (please email me an explanation if there's a good reason to work in a larger group). In all cases you will need to write the report individually.
Project proposals are due on March 13 (please send one or two plain text paragraphs in an email message to bartlett at cs).
Project reports are due on May 2. Please email a pdf file to bartlett at cs.





Schedule

Tue, Jan 21 Organizational issues. Course outline. Probabilistic formulations of prediction problems.
Thu, Jan 23 Plug-in estimators. Linear threshold functions. Perceptron algorithm.
Tue, Jan 28 Minimax risk. Bounds for linear threshold functions.
Thu, Jan 30 Concentration inequalities.
Tue, Feb 4 Concentration inequalities.
Thu, Feb 6 Concentration inequalities. Uniform laws of large numbers.
Tue, Feb 11 Uniform laws of large numbers.
Thu, Feb 13 Vapnik-Chervonenkis dimension.
Tue, Feb 18 Online learning (Wouter Koolen presenting): Mix loss. Dot loss.
Thu, Feb 20 Minimax with dot-loss. Follow the perturbed leader.
Tue, Feb 25 Follow the perturbed leader. Adaptive regret and tracking.
Thu, Feb 27 Normalized maximum likelihood. Universal portfolios.
Tue, Mar 4 Universal portfolios. Context tree weighting.
Thu, Mar 6 Reductions: mixability, gradient trick, specialists.
Tue, Mar 11 Online convex optimization.
Thu, Mar 13 Online convex optimization: Regularization.
Tue, Mar 18 Online convex optimization: Regret bounds.
Thu, Mar 20 Optimal regret.
Tue, Mar 25 Spring break.
Thu, Mar 27 Spring break.
Tue, Apr 1 Kernel methods.
Thu, Apr 3 Kernels, RKHSs, Mercer's Theorem.
Tue, Apr 8 Hard margin SVMs, optimization.
Thu, Apr 10 Soft margin SVMs, representer theorem.
Tue, Apr 15 Risk/regret bounds for SVMs.
Thu, Apr 17 Kernel regression. Convex losses for classification.
Tue, Apr 22 AdaBoost.
Thu, Apr 24 AdaBoost as I-projection.
Tue, Apr 29 Convergence of AdaBoost. Model selection, complexity regularization, consistency of AdaBoost.
Thu, May 1 Final project presentations.