Bachelor 2021/2022

Statistical Learning Theory

Category 'Best Course for New Knowledge and Skills'
Area of studies: Applied Mathematics and Information Science
When: 4 year, 1, 2 module
Mode of studies: offline
Open to: students of one campus
Language: English
ECTS credits: 5
Contact hours: 54

Course Syllabus

Abstract

This course studies mathematical explanations for the learning ability of important machine learning algorithms such as support vector machines, AdaBoost, and overparameterized neural nets. The course has three parts. The first is about online learning. Important ideas and techniques, such as margins and the bias-complexity trade-off, are introduced in a simpler, probability-free setting. In the lectures about multi-armed bandits, probability theory is reviewed (this is also important in reinforcement learning). The second part presents the main definitions (VC-dimension and Rademacher complexity) and proves the main risk bounds, using various measure concentration results. In the third part, the theory is used to derive margin risk bounds. This is applied to SVM classification, AdaBoost, and implicit regularization in overparameterized neural nets. Neural tangent kernels are introduced and the connection with kernel SVM is explained. See http://wiki.cs.hse.ru/Statistical_learning_theory_2023 for the full lecture notes.
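The risk bounds mentioned above rest on measure concentration. As an illustrative sketch (the data and parameters below are arbitrary choices, not from the course materials), one can check empirically that the deviation probability of a sample mean of fair coin flips stays below Hoeffding's bound 2·exp(-2nt²) for [0,1]-bounded variables:

```python
import numpy as np

# Empirical check of Hoeffding's inequality for n fair coin flips:
# P(|mean - 1/2| >= t) should not exceed 2 * exp(-2 * n * t^2).
rng = np.random.default_rng(0)
n, t, trials = 100, 0.1, 20000

means = rng.integers(0, 2, size=(trials, n)).mean(axis=1)
empirical = np.mean(np.abs(means - 0.5) >= t)
bound = 2 * np.exp(-2 * n * t**2)
print(empirical <= bound)  # the bound holds (here it is far from tight)
```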
Learning Objectives

  • Understand theoretical explanations for the success of drop-out regularization in neural nets.
  • Know various measures of model capacity, such as VC-dimension and Rademacher complexity, and use them to derive risk bounds and bounds on sample complexity, both in the realizable and the agnostic setting.
  • Apply important results from measure concentration, such as the Chernoff bound, Hoeffding's bound, and the bounded differences inequality.
  • Prove upper and lower bounds on the number of mistakes made by basic algorithms in online learning, such as the weighted majority algorithm and the perceptron algorithm, and similarly for learning with expert advice and multi-armed bandits.
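As a minimal sketch of one of the online algorithms named in the objectives above, the following implements the perceptron update rule and counts mistakes on a toy linearly separable dataset (the dataset is illustrative, not from the course):

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Online perceptron; returns weights and total mistake count."""
    w = np.zeros(X.shape[1])
    mistakes = 0
    for _ in range(max_epochs):
        changed = False
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:  # misclassified (or on the boundary)
                w += y_i * x_i             # perceptron update
                mistakes += 1
                changed = True
        if not changed:                    # converged: no mistakes this epoch
            break
    return w, mistakes

# Linearly separable toy data: label = sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.0, -1.0], [-2.0, 0.5], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, mistakes = perceptron(X, y)
print(all(np.sign(X @ w) == y))  # True: the learned separator fits all points
```

The classical mistake bound studied in the course (Novikoff's theorem) upper-bounds `mistakes` by (R/γ)², where R bounds the norm of the examples and γ is the margin of a separating hyperplane.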
Course Contents

  • Probably approximately correct learning
  • VC-dimensions
  • Structural risk minimization and variants
  • The time complexity of learning and cryptography
  • Boosting
  • Rademacher complexities
  • Support vector machines and margin theory
  • Multiclass classification and DeepBoost
  • Online learning
  • Reinforcement learning
Assessment Elements

  • non-blocking Homework
  • non-blocking Colloquium
  • non-blocking Exam
Interim Assessment

  • 2021/2022 2nd module
    0.3 * Colloquium + 0.35 * Exam + 0.35 * Homework

Authors

  • BAUVENS Bruno Frederik L
  • LUKIANENKO NIKITA SERGEEVICH
  • PODOLSKIY VLADIMIR VLADIMIROVICH