CS 281B / Stat 241B, Spring 2014:

Statistical Learning Theory

Syllabus


Course description

This course will provide an introduction to the theoretical analysis of prediction methods, focusing on statistical and computational aspects. It will cover approaches such as kernel methods and boosting algorithms, as well as probabilistic and game-theoretic formulations of prediction problems, and it will emphasize tools for analyzing the performance of learning algorithms and the inherent difficulty of learning problems.
Prerequisites: CS281A/Stat241A, or advanced training in probability or statistics, for example at the level of Stat 205A or Stat 210A.

Outline:

  • Probabilistic and game-theoretic formulations of prediction problems
  • Risk bounds
    Overfitting
    Uniform convergence
    Concentration inequalities
    Finite classes
    Rademacher averages
    Vapnik-Chervonenkis dimension
    Covering numbers
  • Model selection
    Approximation-estimation trade-off
    Method of sieves, regularization
    Oracle inequalities
  • Online prediction
    Mistake bounds: halving, weighted majority
    Prediction with expert advice
    Online optimization
    Potential function methods
    Log loss; Bayesian methods
    Portfolio selection
  • Kernel methods
    Perceptron algorithm
    Support vector machines
    Constrained optimization, duality
    Hinge loss
    Reproducing kernel Hilbert spaces
    Representer theorem
    Kernel methods for regression
  • AdaBoost
    Optimization
    Margins analysis
    Logistic regression
