Master's Programme 2024/2025

Introduction to Deep Learning

Status: Compulsory course (Master of Data Science)
Field of study: 01.04.02 Applied Mathematics and Informatics
When taught: 2nd year, 2nd module
Mode of study: with an online course
Online hours: 60
Open to: students of the home campus
Degree programme: Master of Data Science (full-time)
Language of instruction: English
Credits: 3

Course Syllabus

Abstract

In this course we discuss a variety of Deep Learning scientific papers and practice understanding, presenting and reviewing them. We cover papers from different areas of DL, such as Natural Language Processing and Computer Vision, and move from basic ideas to more complicated and more recent developments. The course programme differs between two "tracks": the deep learning track (for students who have taken the Introduction to Deep Learning course) and the math track. The deep learning track covers more complex papers, which require an understanding of basic neural network architectures and the training process. The math track focuses on basic concepts and introduces them through examples of fundamental papers and overview blog posts.
Learning Objectives

  • to be able to critically read and understand a scientific paper in depth
  • to be able to write a peer review
  • to be able to give a scientific presentation
Expected Learning Outcomes

  • Form an understanding of the core tools of the course
  • Gain experience in studying, presenting, reviewing and discussing a paper, and a deep understanding of the NeRF / Word2Vec overview
  • Gain experience in studying, presenting, reviewing and discussing a paper, and a deep understanding of Attention Is All You Need / Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
  • Gain experience in studying, presenting, reviewing and discussing a paper, and a deep understanding of CLIP / GAN
  • Gain experience in studying, presenting, reviewing and discussing a paper, and a deep understanding of Typical Decoding / NeRF
  • Gain experience in studying, presenting, reviewing and discussing papers, and a deep understanding of Robustness May Be at Odds with Accuracy
Course Contents

  • Introduction
  • Papers: NeRF / Word2Vec
  • Papers: Attention Is All You Need / Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
  • Papers: Typical Decoding / GAN
  • Papers: How Can We Know What Language Models Know? / NeRF
  • Papers: Robustness May be at Odds with Accuracy
Assessment Elements

  • non-blocking Average test mark
  • non-blocking Average review mark
    The review mark is averaged over all weeks in which the student writes a review. Penalties for submitting a review after the deadline are progressive: the base penalty for a late submission is 0.2 of the mark, and it increases by 0.1 for every further 24 hours until the review is submitted. For example, with a review deadline on Thursday at 23:59, submitting on Friday at 00:01 incurs a penalty of 0.2 of the mark, on Saturday 0.3, on Sunday 0.4, and so on (see the sketch after this list).
  • non-blocking Presentation mark
    The Presentation mark is the average of the marks assigned during peer review.
  • non-blocking Final Test
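
A minimal Python sketch of how the late-review penalty described above accumulates. The function name is hypothetical, and capping the penalty at the full mark is an assumption not stated in the rules.

    import math

    def late_penalty(hours_late: float) -> float:
        # Fraction of the review mark deducted for a late submission:
        # base penalty 0.2, plus 0.1 for every further full 24 hours.
        # Capping at 1.0 (the whole mark) is an assumption, not stated above.
        if hours_late <= 0:
            return 0.0
        return min(0.2 + 0.1 * math.floor(hours_late / 24), 1.0)

    # Deadline Thursday 23:59 -> Friday 00:01: 0.2, Saturday: 0.3, Sunday: 0.4
    print(late_penalty(0.03), late_penalty(24.1), late_penalty(48.1))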
Interim Assessment

  • 2024/2025 2nd module
    0.1 * Average review mark + 0.3 * Average test mark + 0.3 * Final Test + 0.3 * Presentation mark
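
A small worked example of the weighted formula above in Python; the specific marks and the 10-point scale are illustrative assumptions, not taken from the syllabus.

    def interim_grade(avg_review: float, avg_test: float,
                      final_test: float, presentation: float) -> float:
        # Weights from the interim assessment formula above (they sum to 1.0).
        return (0.1 * avg_review + 0.3 * avg_test
                + 0.3 * final_test + 0.3 * presentation)

    # Illustrative marks on a 10-point scale (an assumption):
    # 0.1*8 + 0.3*7 + 0.3*9 + 0.3*10 = 8.6
    print(interim_grade(8, 7, 9, 10))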
Bibliography

Recommended Core Bibliography

  • Eco, U., Farina, G., & Mongiat Farina, C. (2015). How to Write a Thesis. Cambridge, Massachusetts: The MIT Press. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=963778

Recommended Additional Bibliography

  • Sundström, M. (2020). How Not to Write a Thesis or Dissertation. Edward Elgar Publishing.

Authors

  • Rak Arina Sergeevna
  • Литвишкина Ален Витальевна
  • Ахмедова Гюнай Интигам кызы