2024/2025
Introduction to Deep Learning
Type:
Mago-Lego
Delivered by:
Big Data and Information Retrieval School
When:
Module 2
Online hours:
60
Open to:
students of one campus
Language:
English
ECTS credits:
3
Course Syllabus
Abstract
In this course we discuss a variety of Deep Learning scientific papers and practise understanding, presenting, and reviewing them. We cover papers from different areas of DL, such as Natural Language Processing and Computer Vision, moving from basic ideas to more complicated and more recent developments. The course programme differs between two “tracks”: the deep learning track (for students who have taken the Introduction to Deep Learning course) and the math track. The deep learning track covers more complex papers that require an understanding of basic neural network architectures and the training process. The math track focuses on the basic concepts and introduces them through examples of fundamental papers and overview blog posts.
Learning Objectives
- to critically read and profoundly understand a scientific paper
- to write a peer review
- to give a scientific presentation
Expected Learning Outcomes
- Form an understanding of the core tools of the course
- Gained experience in learning, presenting, reviewing, and discussing a paper; a deep understanding of the NeRF / Word2Vec overview
- Gained experience in learning, presenting, reviewing, and discussing a paper; a deep understanding of Attention Is All You Need / Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
- Gained experience in learning, presenting, reviewing, and discussing a paper; a deep understanding of CLIP / GAN
- Gained experience in learning, presenting, reviewing, and discussing a paper; a deep understanding of Typical Decoding / NeRF
- Gained experience in learning, presenting, reviewing, and discussing papers; a deep understanding of Robustness May Be at Odds with Accuracy
Course Contents
- Introduction
- Papers: NeRF / Word2Vec
- Papers: Attention Is All You Need / Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
- Papers: Typical Decoding / GAN
- Papers: How Can We Know What Language Models Know? / NeRF
- Papers: Robustness May Be at Odds with Accuracy
Assessment Elements
- Average test mark
- Average review mark
The review mark is averaged over all weeks in which the student writes a review. Penalties for submitting a review after the deadline are progressive: 0.2 of the mark is the base penalty for submitting late, and it increases by 0.1 for every further 24 hours until the review is submitted. So, for a review deadline on Thursday at 23:59, submitting on Friday at 00:01 incurs a penalty of 0.2 of the mark, on Saturday 0.3 of the mark, on Sunday 0.4 of the mark, and so on.
- Presentation mark
The presentation mark is the average of the marks assigned during peer review.
- Final Test
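The progressive late-penalty rule above can be sketched as a small function. This is an illustrative helper only (the function name and interface are assumptions, not part of the official grading system); it reproduces the stated schedule of 0.2 base penalty plus 0.1 per further full 24 hours.

```python
import math

def late_penalty(hours_late: float) -> float:
    """Fraction of the review mark deducted for lateness.

    0.2 is the base penalty for any late submission;
    it grows by 0.1 for each additional full 24-hour period.
    (Hypothetical helper illustrating the syllabus rule.)
    """
    if hours_late <= 0:
        return 0.0  # submitted on time: no penalty
    return 0.2 + 0.1 * math.floor(hours_late / 24)

# Deadline Thursday 23:59:
# Friday 00:01  -> ~0.03 h late -> 0.2
# Saturday      -> 24-48 h late -> 0.3
# Sunday        -> 48-72 h late -> 0.4
```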
Interim Assessment
- 2024/2025, 2nd module: 0.1 * Average review mark + 0.3 * Average test mark + 0.3 * Final Test + 0.3 * Presentation mark
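The weighted formula above can be written out as a one-line function for checking one's own grade (an illustrative sketch; the function and argument names are assumptions). Note that the four weights sum to 1.0, so a perfect score in every component yields the maximum mark.

```python
def final_grade(avg_review: float, avg_test: float,
                final_test: float, presentation: float) -> float:
    """Interim assessment, 2024/2025 2nd module (weights from the syllabus)."""
    return (0.1 * avg_review
            + 0.3 * avg_test
            + 0.3 * final_test
            + 0.3 * presentation)
```

For example, with all four components at 10, the result is 10.0; with only the review mark at 10 and the rest at 0, it is 1.0, reflecting the review component's small weight.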
Bibliography
Recommended Core Bibliography
- Eco, U., Farina, G., & Mongiat Farina, C. (2015). How to Write a Thesis. Cambridge, Massachusetts: The MIT Press. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsebk&AN=963778
Recommended Additional Bibliography
- Mikael Sundström. (2020). How Not to Write a Thesis or Dissertation. Edward Elgar Publishing.