Bachelor
2023/2024
Optimization Methods
Type: Compulsory course (Data Science and Business Analytics)
Area of studies: Applied Mathematics and Information Science
Delivered by: Big Data and Information Retrieval School
Where: Faculty of Computer Science
When: 3rd year, modules 1 and 2
Mode of studies: distance learning
Online hours: 10
Open to: students of one campus
Language: English
ECTS credits: 5
Contact hours: 56
Course Syllabus
Abstract
Optimization holds an important place in both theory and practice, since understanding when and to what extent actions should be carried out helps achieve a goal in the best possible way. This course emphasizes data-driven modeling, theory, and numerical algorithms for optimization with real variables. It gives a comprehensive foundation for the theory, methods, and algorithms of mathematical optimization. The prerequisites are linear algebra and calculus.
Learning Objectives
- Students will study the main concepts of optimization theory and develop a methodology for the theoretical investigation of optimization problems.
- Students will obtain an understanding of the design, effectiveness, and practical application of optimization methods and algorithms.
- The course will enable students to solve standard and nonstandard mathematical problems connected with finding optimal solutions.
Expected Learning Outcomes
- Students should be able to assess the rate of convergence of first- and second-order optimization methods.
- Students should be able to classify optimization problems according to their mathematical properties.
- Students should be able to describe the numerical complexity of the optimization algorithms studied during the course.
- Students should be able to implement different optimization codes in a computer environment.
- Students should be able to perform a theoretical investigation of a given optimization problem in order to assess its complexity.
- Students should be able to solve simple optimization problems without a computer.
- Students should be able to write down first and second-order optimality conditions.
Course Contents
- One-dimensional optimization: unimodal functions, convex and quasiconvex functions, zero and first-order methods, local and global minima.
- Existence of solutions: continuous and lower semicontinuous functions, coercive functions, Weierstrass theorem, unique and nonunique solutions.
- Quadratic unconstrained optimization: algebraic solution, complete optimization analysis, steepest descent and conjugate gradient methods.
- Theory of optimality conditions: Fermat principle, the Hessian matrix, positive and negative semidefinite matrices, the Lagrange function and Lagrange multipliers, the Karush-Kuhn-Tucker conditions, regularity, complementarity constraints, stationary points.
- First-order optimization methods: the steepest descent method, conjugate directions, gradient-based methods.
- Second-order optimization methods: Newton's method and its modifications, trust-region methods (a minimal illustrative sketch of steepest descent and Newton's method follows this list).
- Convex optimization: optimality conditions, duality, subgradients and subdifferential, cutting planes and bundle methods, the complexity of convex optimization.
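As an illustration of the first- and second-order methods listed above, here is a minimal sketch (not part of the official course materials; the test problem, step-size rule, and tolerances are assumptions chosen for clarity) that applies steepest descent and Newton's method to an unconstrained quadratic problem.

```python
import numpy as np

# Illustrative only: minimize f(x) = 0.5 x^T A x - b^T x for a symmetric
# positive definite A. The exact minimizer solves A x = b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    # Gradient of the quadratic: f'(x) = A x - b
    return A @ x - b

def steepest_descent(x, tol=1e-8, max_iter=1000):
    # First-order method: step along -f'(x) with the exact line search
    # available for quadratics: t = (g^T g) / (g^T A g).
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = (g @ g) / (g @ A @ g)
        x = x - t * g
    return x

def newton(x, tol=1e-8, max_iter=50):
    # Second-order method: the Hessian of a quadratic is A itself, so a
    # single Newton step x - A^{-1} f'(x) already reaches the minimizer.
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(A, g)
    return x

x0 = np.zeros(2)
print(steepest_descent(x0))   # both should approach the solution of A x = b
print(newton(x0))
print(np.linalg.solve(A, b))  # reference solution
```

On a quadratic with a positive definite Hessian, Newton's method reaches the minimizer in one step, while steepest descent converges linearly at a rate governed by the condition number of A.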
Assessment Elements
- Control Testing 1: consists of tasks devoted to different topics of the course.
- Control Testing 2: consists of tasks devoted to different topics of the course.
- Control Testing 3: consists of tasks devoted to different topics of the course.
- Final test: consists of tasks devoted to different topics of the course.
Interim Assessment
- 2023/2024, 2nd module. The final grade (FG) for the course is determined by the formula FG = 0.7*(CT1 + CT2 + CT3) + 0.3*FT, where CT1, CT2 and CT3 are Control Testings 1, 2 and 3, and FT is the Final Test. If a student has no objections to the total grade obtained from CT1, CT2, CT3 and FT, he/she receives it as an automatic pass. Otherwise, the student takes an additional test with a maximum grade of 5 out of 10. If a student does not complete any of the tests during the semester, his/her exam grade is limited to 5 out of 10.
Bibliography
Recommended Core Bibliography
- Arkadi Nemirovski. (2001). Lectures on modern convex optimization. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsbas&AN=edsbas.5E080C05
- Mokhtar S. Bazaraa, Hanif D. Sherali, & C. M. Shetty. (2006). Nonlinear Programming : Theory and Algorithms: Vol. 3rd ed. Wiley-Interscience.
Recommended Additional Bibliography
- Yurii Nesterov. (2018). Lectures on Convex Optimization (2nd ed.). Springer.