Selected Optimization Methods and Algorithms

Data is displayed for the academic year 2024/2025.

Course Description

Optimization is the process of bringing something into the best possible state with respect to given requirements and constraints. Optimization plays an increasing role in contemporary society and the economy, so this course provides students with a survey of the most important achievements of this field, with an emphasis on applied knowledge. The principal themes are as follows. Theoretical foundations of optimization: differentials, directional derivatives, Taylor's formula, extrema, convexity and concavity of a function, O-notation in optimization. An introduction to optimization theory. Unconstrained optimization: optimization in one dimension, gradient methods, step size, stopping criteria, Newton's method, conjugate gradient methods, quasi-Newton methods, trust-region methods, the Nelder-Mead simplex algorithm. Constrained optimization: Karush-Kuhn-Tucker optimality conditions, Lagrangian duality, projections, Lagrangian algorithms, penalty methods, the augmented Lagrangian method. Selected heuristic methods (simulated annealing, particle swarm optimization, ant colony optimization, genetic algorithms). Multiobjective optimization.
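
As an illustration of the unconstrained methods named above (gradient methods, step size, stopping criteria), the following minimal sketch implements gradient descent with a backtracking (Armijo) line search on a small quadratic problem; the objective, the parameter values, and the tolerance are illustrative assumptions, not part of the official course material.

    # Minimal sketch (not course material): gradient descent with a backtracking
    # (Armijo) line search and a gradient-norm stopping criterion.
    import numpy as np

    def gradient_descent(f, grad, x0, tol=1e-6, max_iter=1000):
        x = x0.copy()
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:      # stopping criterion: small gradient
                break
            t = 1.0
            # Backtracking: halve the step until the Armijo sufficient-decrease
            # condition f(x - t*g) <= f(x) - c*t*||g||^2 holds (here c = 1e-4).
            while f(x - t * g) > f(x) - 1e-4 * t * (g @ g):
                t *= 0.5
            x = x - t * g
        return x

    # Illustrative problem: minimize 0.5 x^T A x - b^T x, whose minimizer solves A x = b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x_min = gradient_descent(lambda x: 0.5 * x @ A @ x - b @ x,
                             lambda x: A @ x - b,
                             np.zeros(2))
    print(x_min, np.linalg.solve(A, b))      # the two vectors should nearly coincide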

Prerequisites

Intermediate-level multivariable calculus, linear algebra, and programming skills (and/or experience with Matlab or a similar tool).

Study Programmes

University graduate
[FER3-HR] Audio Technologies and Electroacoustics - profile
Elective Courses (2. semester)
[FER3-HR] Communication and Space Technologies - profile
Elective Courses (2. semester)
[FER3-HR] Computational Modelling in Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Computer Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Computer Science - profile
Elective Courses (2. semester)
[FER3-HR] Control Systems and Robotics - profile
Elective Courses (2. semester)
[FER3-HR] Data Science - profile
Elective Courses (2. semester)
[FER3-HR] Electrical Power Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Electric Machines, Drives and Automation - profile
Elective Courses (2. semester)
[FER3-HR] Electronic and Computer Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Electronics - profile
Elective Courses (2. semester)
[FER3-HR] Information and Communication Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Network Science - profile
Elective Courses (2. semester)
[FER3-HR] Software Engineering and Information Systems - profile
Elective Course of the profile (2. semester)
Elective Courses (2. semester)

Learning Outcomes

  1. recognize and analyze a problem as an optimization problem
  2. relate the specified problem to a known, already solved problem of the same or a similar kind
  3. select the best method for a known problem
  4. adjust methods and algorithms for similar problems so that they become suitable for the given one
  5. design one's own solution for an unknown problem
  6. assess the validity of one's own solution in terms of algorithm convergence and complexity
  7. compare and contrast one's own solution with possible alternatives
  8. recommend an algorithm for an unknown problem on the basis of an analysis of alternatives

Forms of Teaching

Lectures

Lecturer's exposition and further clarifications based on students' questions.

Independent assignments

Each student will theoretically and practically solve an optimization problem of their own choice.

Work with mentor

Assistance to students with their chosen optimization problems.

Grading Method

Type                Continuous Assessment             Exam
                    Threshold    Percent of Grade     Threshold    Percent of Grade
Seminar/Project     0 %          70 %                 0 %          0 %
Final Exam: Oral                 30 %
Exam: Written                                         0 %          70 %
Exam: Oral                                                         30 %

Week by Week Schedule

  1. Constant, logarithmic, linear, quadratic, and exponential complexity classes
  2. Tangent and normal lines to the graph of function; Increasing and decreasing functions
  3. Directional derivative; Derivative of implicit function; Theorem of implicit function, Extrema; Local extrema
  4. Second differential and quadratic forms, Taylor's formula
  5. Convexity and concavity of a function; Finding extrema of a function, necessary and sufficient conditions; Optimization in one dimension (golden section search, successive parabolic interpolation, Newton's method); see the golden section search sketch after this schedule
  6. Direct search algorithms (the Hooke-Jeeves method); Gradient methods (steepest descent); Newton's method; The secant method; Stopping criteria
  7. Conjugate Gradient Methods, Quasi-Newton Methods
  8. Midterm exam
  9. Trust-Region Methods
  10. Extrema of a function subject to constraints; Lagrange multipliers; Augmented Lagrangian method
  11. Convex optimization problems
  12. Duality; The Lagrange function; Perturbation and sensitivity analysis
  13. Deterministic and heuristic approaches
  14. Meta-heuristics: simulated annealing, tabu search, evolutionary strategies, ant colony optimization, bee mating optimization, and greedy randomized adaptive search procedures (GRASP)
  15. Final exam
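
For the one-dimensional methods of week 5, a minimal sketch of golden section search is given below; the test function, the interval, and the tolerance are illustrative assumptions, and a practical implementation would reuse one function evaluation per iteration instead of re-evaluating both interior points.

    # Minimal sketch (not course material): golden section search for a unimodal
    # function on [a, b]. For clarity f is re-evaluated at both interior points
    # each iteration; a practical version caches one of the two values.
    import math

    def golden_section_search(f, a, b, tol=1e-8):
        inv_phi = (math.sqrt(5.0) - 1.0) / 2.0      # 1/phi, approximately 0.618
        c = b - inv_phi * (b - a)                   # left interior point
        d = a + inv_phi * (b - a)                   # right interior point
        while b - a > tol:
            if f(c) < f(d):                         # minimum lies in [a, d]
                b, d = d, c
                c = b - inv_phi * (b - a)
            else:                                   # minimum lies in [c, b]
                a, c = c, d
                d = a + inv_phi * (b - a)
        return 0.5 * (a + b)

    # Illustrative use: the minimum of (x - 2)^2 on [0, 5] is at x = 2.
    print(golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0))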

Literature

Edwin K. P. Chong, Wu-Sheng Lu, and Stanislaw H. Żak (2023), An Introduction to Optimization: With Applications to Machine Learning, 5th Edition, Wiley
Jorge Nocedal, Stephen J. Wright (2006), Numerical Optimization, 2nd Edition, Springer
Stephen Boyd, Lieven Vandenberghe (2004), Convex Optimization, Cambridge University Press
Carl D. Meyer (2000), Matrix Analysis and Applied Linear Algebra, SIAM
Yurii Nesterov (2018), Lectures on Convex Optimization, Springer
G. A. F. Seber and C. J. Wild (2005), Nonlinear Regression, Wiley
Omid Bozorg-Haddad, Mohammad Solgi, Hugo A. Loáiciga (2017), Meta-heuristic and Evolutionary Algorithms for Engineering Optimization, John Wiley & Sons

General

ID 222557
  Summer semester
5 ECTS
L1 English Level
L1 e-Learning
45 Lectures
0 Seminar
0 Exercises
15 Laboratory exercises
0 Project laboratory
0 Physical education exercises

Grading System

90 % Excellent
75 % Very Good
60 % Good
50 % Sufficient