Master's programme, 2024/2025

Introduction to Deep Learning

Status: Compulsory course
Field of study: 01.04.02. Applied Mathematics and Informatics
When taught: 2nd year, module 2
Mode of study: with an online course
Online hours: 60
Open to: students of one campus
Degree programme: Master of Data Science (o)
Language: English
Credits: 3
Contact hours: 12

Course Syllabus

Abstract

In this course, you will learn the basics of deep learning. We start with a recap of linear models and discuss the stochastic optimization methods that are crucial for training deep neural networks. You will then study the popular building blocks of neural networks, including fully connected, convolutional, and recurrent layers, and use them to define complex modern architectures in the PyTorch framework. In the course project, you will implement a deep neural network for image captioning, i.e., generating a text description for an input image.
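
For illustration only (not part of the official syllabus), here is a minimal PyTorch sketch of how such building blocks can be combined into a small image-captioning-style model; the class name and all sizes (vocab_size, hidden_dim) are illustrative assumptions, not course materials:

    # A convolutional encoder plus a recurrent decoder, assembled from the
    # building blocks mentioned above. All sizes are illustrative only.
    import torch
    import torch.nn as nn

    class TinyCaptioner(nn.Module):
        def __init__(self, vocab_size=1000, hidden_dim=128):
            super().__init__()
            # Convolutional encoder: image -> feature vector.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(16, hidden_dim),
            )
            # Recurrent decoder: features + previous tokens -> next-token scores.
            self.embed = nn.Embedding(vocab_size, hidden_dim)
            self.rnn = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, vocab_size)

        def forward(self, images, captions):
            feats = self.encoder(images)   # (batch, hidden_dim)
            h0 = feats.unsqueeze(0)        # initial hidden state (1, batch, hidden_dim)
            c0 = torch.zeros_like(h0)
            out, _ = self.rnn(self.embed(captions), (h0, c0))
            return self.head(out)          # (batch, seq_len, vocab_size)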
Learning Objectives

  • Understand linear regression: mean squared error, analytical solution.
  • Understand logistic regression: model, cross-entropy loss, class probability estimation.
  • Understand gradient descent for linear models, including the derivatives of the MSE and cross-entropy loss functions (see the worked example after this list).
  • Understand the problem of overfitting.
  • Understand regularization for linear models.
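
For illustration only, a compact worked example for the objectives above: for linear regression with weights w over N training pairs (x_i, y_i), the mean squared error, its gradient, and the gradient-descent update with learning rate \eta are

    L(w) = \frac{1}{N} \sum_{i=1}^{N} (w^\top x_i - y_i)^2
    \nabla_w L(w) = \frac{2}{N} \sum_{i=1}^{N} (w^\top x_i - y_i)\, x_i
    w \leftarrow w - \eta \, \nabla_w L(w)
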
Expected Learning Outcomes

  • Use linear models for classification and regression tasks.
  • Apply regularization to train better models.
  • Tune SGD optimization using different techniques.
  • Train a linear model for classification or regression task using stochastic gradient descent.
  • Explain the mechanics of basic building blocks for neural networks.
  • Apply backpropagation algorithm to train deep neural networks using automatic differentiation.
  • Implement, train and test neural networks using PyTorch (see the training-loop sketch after this list).
  • Understand building blocks and training tricks of modern CNNs.
  • Define and train a CNN from scratch.
  • Use pre-trained CNN to solve a new task.
  • Understand what is unsupervised learning and how you can benefit from it.
  • Implement and train deep autoencoders.
  • Apply autoencoders for image retrieval and image morphing.
  • Understand basics of unsupervised learning of word embeddings.
  • Implement and train generative adversarial networks.
  • Define and train an RNN from scratch.
  • Understand modern architectures of RNNs: LSTM, GRU.
  • Use RNNs for different types of tasks: sequential input, sequential output, sequential input and output.
  • Apply your skills to train an Image Captioning model.
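
For illustration only (not course materials), a minimal sketch of the training loop behind several of the outcomes above, assuming a generic classification task; the toy data, layer sizes and hyperparameters are placeholders:

    import torch
    import torch.nn as nn

    # Toy data standing in for a real dataset: 10 features, 3 classes.
    X, y = torch.randn(256, 10), torch.randint(0, 3, (256,))
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(X, y), batch_size=32, shuffle=True)

    model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 3))
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

    for epoch in range(5):
        for xb, yb in loader:
            optimizer.zero_grad()            # reset accumulated gradients
            loss = criterion(model(xb), yb)  # forward pass + loss
            loss.backward()                  # backpropagation via autograd
            optimizer.step()                 # SGD parameter update
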
Course Contents

  • 1. Introduction to optimization
  • 2. Introduction to neural networks
  • 3. Deep Learning for images
  • 4. Unsupervised representation learning
  • 5. Deep learning for sequences
  • 6. Final Project
Assessment Elements

  • non-blocking Programming Assignments
    Weekly programming assignments.
  • non-blocking Quizzes
    Weekly quizzes.
  • non-blocking Staff Graded Assignments
Interim Assessment

  • 2024/2025 2nd module
    0.5 * Programming Assignments + 0.3 * Quizzes + 0.2 * Staff Graded Assignments
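    For example, with hypothetical element grades of 8 (Programming Assignments), 7 (Quizzes) and 9 (Staff Graded Assignments), the formula gives 0.5 · 8 + 0.3 · 7 + 0.2 · 9 = 7.9.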
Bibliography

Recommended Core Bibliography

  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press. URL: http://www.deeplearningbook.org

Recommended Additional Bibliography

  • Hastie, T., Tibshirani, R., & Friedman, J. H. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction (2nd ed., corrected 7th printing). New York: Springer. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=277008

Authors

  • Боднарук Иван Иванович