Deep Learning 1

Data is displayed for academic year 2023/2024.



Course Description

Deep learning is a branch of machine learning based on complex data representations obtained by a sequence of trained non-linear transformations. Deep learning methods have been successfully applied in many important artificial intelligence fields such as computer vision, natural language processing, speech and audio understanding, as well as bioinformatics. This course introduces the most important deep discriminative and generative models, with a special focus on practical implementations. Part one introduces the key elements of classical feed-forward neural networks and overviews the basic building blocks, regularization techniques and learning procedures specific to deep models. Part two considers deep convolutional models and illustrates their application in image classification and natural language processing. Part three considers sequence modelling with deep recurrent models and illustrates applications in natural language processing. Finally, part four is devoted to deep generative models and their applications in vision and text representation. All concepts are accompanied by examples and exercises in Python. Most exercises will be implemented in a suitable deep learning framework (e.g. TensorFlow or PyTorch).
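The core idea of part one, a sequence of trained non-linear transformations, can be sketched in a few lines of NumPy. The network below (layer sizes, ReLU activation, softmax output) is a minimal illustrative example, not the course's reference implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Element-wise non-linearity between the two affine transformations.
    return np.maximum(0.0, x)

def forward(x, W1, b1, W2, b2):
    # First trained non-linear transformation: affine map followed by ReLU.
    h = relu(x @ W1 + b1)
    # Second transformation: affine map to class scores, then softmax.
    scores = h @ W2 + b2
    exp = np.exp(scores - scores.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

# Tiny model: 4 input features -> 8 hidden units -> 3 classes.
W1 = rng.normal(scale=0.1, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 3)); b2 = np.zeros(3)

# Each row of the output is a probability distribution over the 3 classes.
probs = forward(rng.normal(size=(5, 4)), W1, b1, W2, b2)
```

In a framework such as PyTorch or TensorFlow, the same two layers would be library modules and the gradients of the trained parameters would be recovered automatically.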

Study Programmes

University graduate
[FER3-HR] Audio Technologies and Electroacoustics - profile
Elective Courses (2. semester)
[FER3-HR] Communication and Space Technologies - profile
Elective Courses (2. semester)
[FER3-HR] Computational Modelling in Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Computer Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Computer Science - profile
Core-elective courses (2. semester)
[FER3-HR] Control Systems and Robotics - profile
Elective Courses (2. semester)
Elective Courses of the Profile (2. semester)
[FER3-HR] Data Science - profile
Elective Courses (2. semester)
Elective Courses of the Profile (2. semester)
[FER3-HR] Electrical Power Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Electric Machines, Drives and Automation - profile
Elective Courses (2. semester)
[FER3-HR] Electronic and Computer Engineering - profile
Elective Courses (2. semester)
[FER3-HR] Electronics - profile
Elective Courses (2. semester)
[FER3-HR] Information and Communication Engineering - profile
Elective Courses (2. semester)
Elective Courses of the Profile (2. semester)
[FER3-HR] Network Science - profile
Elective Courses (2. semester)
[FER3-HR] Software Engineering and Information Systems - profile
Elective Course of the Profile (2. semester)
Elective Courses (2. semester)
[FER2-HR] Computer Science - profile
Specialization Course (2. semester)

Learning Outcomes

  1. Explain advantages of deep learning with respect to the alternative machine learning approaches.
  2. Apply techniques for supervised training of fully connected, convolutional and recurrent deep models.
  3. Apply deep learning techniques in understanding of images and text.
  4. Analyze and evaluate the performance of deep models.
  5. Design deep models in a high-level programming language.

Forms of Teaching


Lectures

13 lectures of three hours each, winter semester only.

Independent assignments

Students may receive additional points for presenting a technical seminar.


Laboratory

Two exercises in each half of the semester.

Grading Method

                          Continuous Assessment       Exam
Type                      Threshold   % of Grade      Threshold   % of Grade
Laboratory Exercises      50 %        20 %            50 %        0 %
Mid Term Exam: Written    0 %         40 %            -           0 %
Final Exam: Written       0 %         40 %            -           -
Exam: Written             -           -               50 %        80 %
Exam: Oral                -           -               -           20 %

Week by Week Schedule

  1. Motivation for deep learning; course information; basics of machine learning.
  2. Fully connected feed-forward models; loss functions; activation functions; universal approximation theorem.
  3. Recovering gradients by backward propagation; learning with gradient descent; differentiable programming.
  4. Deep convolutional networks: layers, architectures, visualization, fine-tuning, applications, implementation.
  5. Convolutional models for computer vision; convolutional layers; pooling layers; efficient implementation; classification architectures.
  6. Convolutional backprop; recovering gradient in 1D and 2D case; implementation details.
  7. Optimization with gradient descent; challenges in learning deep models; learning with momentum.
  8. Midterm exam
  9. Nesterov accelerated momentum, adaptive momentum; normalization of activations: LRN, batchnorm.
  10. Correspondence embeddings; Siamese models; triplet loss.
  11. Regularization; penalizing the parameter norm (weight decay); data augmentation and noise injection; early stopping; parameter tying and sharing; bagging and ensembling; excluding activations (dropout).
  12. Recurrent models; sequence modeling; applications in natural language understanding; formulation and optimization.
  13. RNN cell extensions; exploding gradient; advanced RNN cells; attention.
  14. Details of convolutional models; residual and dense connectivity; advanced techniques; model interpretation and prediction explanation; object detection; dense prediction; real time; challenges.
  15. Final exam
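The optimization topic of weeks 7 and 9 (learning with momentum) can be sketched in pure Python on a one-dimensional quadratic. The update rule below is classical momentum; the learning rate, momentum coefficient, and test function are illustrative choices, not values from the course:

```python
def sgd_momentum(grad, w0, lr=0.1, beta=0.9, steps=300):
    """Minimise a function given its gradient using classical momentum:
    v <- beta * v - lr * grad(w);  w <- w + v."""
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(w)  # velocity accumulates past gradients
        w = w + v                    # parameter moves along the velocity
    return w

# Example: minimise f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# The iterates oscillate around the minimum at w = 3 before settling.
w_star = sgd_momentum(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

Nesterov accelerated momentum (week 9) differs only in evaluating the gradient at the look-ahead point `w + beta * v` instead of at `w`.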


Literature

Ian Goodfellow, Yoshua Bengio, Aaron Courville (2016), Deep Learning, MIT Press
Nikhil Buduma, Nicholas Locascio (2017), Fundamentals of Deep Learning, O'Reilly Media

For students


ID 252377
  Summer semester
L1 English Level
L1 e-Learning
45 Lectures
0 Seminar
15 Exercises
8 Laboratory exercises
0 Project laboratory
0 Physical education exercises

Grading System

89 Excellent
76 Very Good
63 Good
50 Sufficient
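One hedged reading of the tables above: in the continuous-assessment track the total is weighted as laboratory 20 %, midterm 40 %, final written exam 40 %, with a 50 % laboratory threshold, and the total is then mapped through the grade boundaries 89/76/63/50. The helper below is an illustrative sketch of that reading, not an official grade calculator:

```python
def course_grade(lab_pct, midterm_pct, final_pct):
    """Continuous-assessment track: lab 20 % + midterm 40 % + final 40 %.
    The laboratory score must reach its 50 % threshold; grade boundaries
    follow the Grading System table (89/76/63/50)."""
    if lab_pct < 50:
        return None  # laboratory threshold not met
    total = 0.20 * lab_pct + 0.40 * midterm_pct + 0.40 * final_pct
    for bound, grade in [(89, "Excellent"), (76, "Very Good"),
                         (63, "Good"), (50, "Sufficient")]:
        if total >= bound:
            return grade
    return "Fail"

# e.g. lab 80 %, midterm 70 %, final 75 %: total = 74 -> "Good"
```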