Deep Learning 1
Deep learning is a branch of machine learning based on complex data representations obtained through a sequence of trained non-linear transformations. Deep learning methods have been successfully applied in many important artificial intelligence fields such as computer vision, natural language processing, speech and audio understanding, as well as bioinformatics. This course introduces the most important deep discriminative and generative models, with a special focus on practical implementations. Part one introduces key elements of classical feed-forward neural networks and overviews basic building blocks, regularization techniques and learning procedures that are specific to deep models. Part two considers deep convolutional models and illustrates their application in image classification and natural language processing. Part three considers sequence modelling with deep recurrent models and illustrates applications in natural language processing. Finally, part four is devoted to generative deep models and their applications in vision and text representation. All concepts are accompanied by examples and exercises in Python. Most exercises will be implemented in a suitable deep learning framework (e.g. TensorFlow or PyTorch).
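The description above characterizes a deep model as a sequence of trained non-linear transformations. As a minimal sketch of that idea (all layer sizes and parameter values below are hypothetical, chosen only for illustration), a two-layer fully connected network in plain Python:

```python
import math

def sigmoid(x):
    # Smooth non-linearity squashing any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def dense(inputs, weights, biases):
    # One fully connected layer: an affine map followed by the non-linearity.
    return [sigmoid(sum(w * v for w, v in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Hypothetical hand-picked parameters for a 2-3-1 network.
W1 = [[0.5, -0.2], [0.3, 0.8], [-0.6, 0.1]]
b1 = [0.1, -0.1, 0.2]
W2 = [[1.0, -1.0, 0.5]]
b2 = [0.0]

# Composing the two layers yields the "sequence of non-linear transformations".
hidden = dense([1.0, 2.0], W1, b1)
output = dense(hidden, W2, b2)
print(output)
```

In practice the course's frameworks (TensorFlow, PyTorch) provide such layers as reusable, differentiable building blocks; this sketch only shows the composition itself.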
Learning Outcomes
- Explain the advantages of deep learning with respect to alternative machine learning approaches.
- Apply techniques for training deep models.
- Explain application fields of deep discriminative and generative models.
- Apply deep learning techniques to the understanding of images and text.
- Distinguish kinds of deep models that are appropriate in supervised, semi-supervised and unsupervised applications.
- Analyze and evaluate the performance of deep models.
- Design deep models in a high-level programming language.
Forms of Teaching
Lectures
13 lectures of three hours each.
Independent assignments
Students may receive additional points for presenting a technical seminar.
Laboratory
Two exercises in each half of the semester.
| Type | Threshold (Continuous Assessment) | Percent of Grade (Continuous Assessment) | Threshold (Exam) | Percent of Grade (Exam) |
|---|---|---|---|---|
| Laboratory Exercises | 40 % | 20 % | 50 % | 0 % |
| Mid Term Exam: Written | 0 % | 40 % | | 0 % |
| Final Exam: Written | 0 % | 40 % | | |
| Exam: Written | | | 50 % | 80 % |
| Exam: Oral | | | | 20 % |
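Reading the grading rows above as two routes, continuous assessment versus a single exam, the percentages in each route should combine into a full grade. A quick sanity check (the grouping of components into routes is my interpretation of the table):

```python
# Continuous-assessment route: labs + midterm + final (interpretation of the table above).
continuous = {"Laboratory Exercises": 20, "Mid Term Exam: Written": 40, "Final Exam: Written": 40}
# Exam route: labs contribute 0 %, written exam 80 %, oral exam 20 %.
exam = {"Laboratory Exercises": 0, "Exam: Written": 80, "Exam: Oral": 20}
print(sum(continuous.values()), sum(exam.values()))  # → 100 100
```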
Week by Week Schedule
- Motivation for deep learning; course information; basics of machine learning.
- Fully connected feed-forward models; loss functions; activation functions; universal approximation theorem.
- Recovering gradients by backward propagation; learning with gradient descent; differentiable programming.
- Deep convolutional networks: layers, architectures, visualization, fine-tuning, applications, implementation.
- Convolutional models for computer vision; convolutional layers; pooling layers; efficient implementation; classification architectures.
- Convolutional backprop; recovering gradient in 1D and 2D case; implementation details.
- Optimization with gradient descent; challenges in learning deep models; learning with momentum.
- Midterm exam
- Nesterov accelerated momentum, adaptive momentum; normalization of activations: LRN, batchnorm.
- Correspondence embeddings; Siamese models; triplet loss.
- Regularization; penalizing the parameter norm (weight decay); data augmentation and noise injection; early stopping; parameter tying and sharing; bagging and ensembling; excluding activations (dropout).
- Recurrent models; sequence modeling; applications in natural language understanding; formulation and optimization.
- RNN cell extensions; exploding gradient; advanced RNN cells; attention.
- Details of convolutional models; residual and dense connectivity; advanced techniques; model interpretation and prediction explanation; object detection; dense prediction; real-time operation; challenges.
- Final exam
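Several of the weeks above (backward propagation, learning with gradient descent, learning with momentum) can be sketched end-to-end on a toy problem. The following is a pure-Python illustration, not course material: a 2-4-1 sigmoid network trained on XOR with classical momentum; all hyperparameter values are assumptions chosen for the example.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    # Logistic non-linearity used in both layers.
    return 1.0 / (1.0 + math.exp(-z))

# Toy dataset: XOR, which no linear model can fit.
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

H, LR, MU = 4, 0.5, 0.9  # hidden units, learning rate, momentum (assumed values)

# Parameters of a 2-H-1 network, plus matching momentum buffers (velocities).
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0
vW1 = [[0.0, 0.0] for _ in range(H)]
vb1 = [0.0] * H
vW2 = [0.0] * H
vb2 = 0.0

def forward(x):
    h = [sigmoid(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i]) for i in range(H)]
    o = sigmoid(sum(W2[i] * h[i] for i in range(H)) + b2)
    return h, o

def mean_squared_error():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

loss_before = mean_squared_error()

for _ in range(5000):
    for x, y in data:
        h, o = forward(x)
        # Backward pass: chain rule through the squared-error loss.
        do = (o - y) * o * (1.0 - o)
        for i in range(H):
            dh = do * W2[i] * h[i] * (1.0 - h[i])
            # Momentum update: v <- MU * v - LR * grad; param += v.
            vW2[i] = MU * vW2[i] - LR * do * h[i]
            W2[i] += vW2[i]
            for j in range(2):
                vW1[i][j] = MU * vW1[i][j] - LR * dh * x[j]
                W1[i][j] += vW1[i][j]
            vb1[i] = MU * vb1[i] - LR * dh
            b1[i] += vb1[i]
        vb2 = MU * vb2 - LR * do
        b2 += vb2

loss_after = mean_squared_error()
print(loss_before, loss_after)
```

The momentum rule here is the classical (heavy-ball) form; the Nesterov and adaptive variants covered later in the schedule modify where and how the gradient is evaluated and scaled.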
- Audio Technologies and Electroacoustics (profile): Free Elective Courses (2nd semester)
- Communication and Space Technologies (profile): Free Elective Courses (2nd semester)
- Computational Modelling in Engineering (profile): Free Elective Courses (2nd semester)
- Computer Engineering (profile): Free Elective Courses (2nd semester)
- Computer Science (profile): Core-elective Courses (2nd semester)
- Control Systems and Robotics (profile): Elective Courses of the Profile (2nd semester)
- Data Science (profile): Elective Courses of the Profile (2nd semester)
- Electrical Power Engineering (profile): Free Elective Courses (2nd semester)
- Electric Machines, Drives and Automation (profile): Free Elective Courses (2nd semester)
- Electronic and Computer Engineering (profile): Free Elective Courses (2nd semester)
- Electronics (profile): Free Elective Courses (2nd semester)
- Information and Communication Engineering (profile): Elective Courses of the Profile (2nd semester)
- Network Science (profile): Free Elective Courses (2nd semester)
- Software Engineering and Information Systems (profile): Elective Courses of the Profile (2nd semester)
- Ian Goodfellow, Yoshua Bengio, Aaron Courville (2016), Deep Learning, MIT Press
- Nikhil Buduma, Nicholas Locascio (2017), Fundamentals of Deep Learning, O'Reilly Media