Machine Learning 2

Learning Outcomes

  1. Define the basic concepts of computational learning theory
  2. Explain the PAC and VC frameworks
  3. Apply computational learning theory methods to basic supervised machine learning algorithms
  4. Explain the theoretical assumptions, advantages, and limitations of Bayesian machine learning
  5. Define the Bayesian variants of classification and regression models
  6. Explain the theoretical assumptions, advantages, and limitations of sparse kernel machines
  7. Explain the main approaches to semi-supervised machine learning and list their advantages and disadvantages

Forms of Teaching

Lectures

Exercises

Independent assignments

Laboratory

Week by Week Schedule

  1. Machine learning approaches and paradigms
  2. Overfitting and model selection, empirical and structural risk minimization
  3. Probably approximately correct (PAC) learning, U-learnability
  4. Probably approximately correct (PAC) learning, U-learnability (continued; a representative sample-complexity bound is sketched after this schedule)
  5. Vapnik-Chervonenkis (VC) dimension
  6. Online learning, Weak learning and boosting
  7. Self-training and co-training, Propagation methods (label propagation, random walks), Semi-supervised and transductive support vector machines, Graph-based algorithms, Multi-view algorithms (a minimal label-propagation sketch follows this schedule)
  8. Midterm exam
  9. Beta-binomial model, Dirichlet-multinomial model (the conjugate update for the beta-binomial case is stated after this schedule)
  10. Ridge regression, Bayesian linear regression, Bayesian logistic regression
  11. Sparse linear models (lasso, coordinate descent, LARS, group lasso, proximal and gradient projection methods, elastic net), Kernel functions (RBF, graph kernels, Mercer kernels, linear kernels), Kernel machines and sparse kernel machines (a coordinate-descent lasso sketch follows this schedule)
  12. Bayesian networks, Markov networks, Variational inference (variational Bayes, belief propagation), Monte Carlo inference (rejection sampling, importance sampling)
  13. Markov chain Monte Carlo sampling methods: Gibbs sampling and the Metropolis-Hastings algorithm, Mixture models, Expectation-maximization (EM) algorithm, Markov and hidden Markov models, Conditional random fields (a minimal Metropolis-Hastings sketch follows this schedule)
  14. Bayesian ensembles (Bayesian parameter averaging, Bayesian model combination)
  15. Final exam
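
As a pointer to the flavour of results covered in the PAC weeks, the standard sample-complexity bound for a finite hypothesis class \(\mathcal{H}\) in the realizable setting (treated in the Shalev-Shwartz and Ben-David text listed under Literature) states that

\[
m \;\ge\; \frac{1}{\varepsilon}\left(\ln\lvert\mathcal{H}\rvert + \ln\frac{1}{\delta}\right)
\]

labelled examples suffice for empirical risk minimization to return, with probability at least \(1-\delta\), a hypothesis whose true error is at most \(\varepsilon\).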
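For week 7, a minimal label-propagation sketch in Python, assuming a fully connected graph with Gaussian (RBF) edge weights; the function name, the similarity choice, and all parameter values are illustrative rather than taken from the course materials:

import numpy as np

def label_propagation(X, y, n_iter=100, sigma=1.0):
    """Propagate labels over a Gaussian-similarity graph.

    X: (n, d) feature matrix; y: (n,) integer labels, -1 for unlabeled.
    Returns predicted labels for all n points.
    """
    n = X.shape[0]
    # Gaussian (RBF) edge weights between all pairs of points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    P = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

    labeled = y >= 0
    classes = np.unique(y[labeled])
    F = np.zeros((n, classes.size))        # soft label distribution per point
    F[labeled, np.searchsorted(classes, y[labeled])] = 1.0
    F_labeled = F[labeled].copy()

    for _ in range(n_iter):
        F = P @ F                          # diffuse labels along the edges
        F[labeled] = F_labeled             # clamp the known labels

    return classes[F.argmax(axis=1)]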
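For week 9, the conjugacy that makes the beta-binomial model tractable, in the usual notation: with a \(\mathrm{Beta}(a, b)\) prior on the success probability and \(k\) successes observed in \(n\) Bernoulli trials, the posterior is again a beta distribution,

\[
p(\theta \mid k, n) = \mathrm{Beta}(\theta \mid a + k,\; b + n - k).
\]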
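For week 11, a minimal coordinate-descent sketch for the lasso objective (1/2n)||y - Xw||^2 + lam * ||w||_1, written for clarity rather than speed and assuming no all-zero feature columns:

import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(d):
            # Residual with feature j's current contribution removed.
            r = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r / n
            # Exact minimization over w_j: soft-threshold, then rescale.
            w[j] = soft_threshold(rho, lam) / col_sq[j]
    return w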
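For week 13, a minimal random-walk Metropolis-Hastings sketch targeting an unnormalized one-dimensional density; the Gaussian proposal and the step size are illustrative choices:

import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=0.5, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()  # symmetric proposal
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

# Example: draw from a standard normal, known only up to normalization.
draws = metropolis_hastings(lambda z: -0.5 * z ** 2, x0=0.0, n_samples=5000)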

Study Programmes

University graduate
Audio Technologies and Electroacoustics (profile)
Free Elective Courses (3. semester)
Communication and Space Technologies (profile)
Free Elective Courses (3. semester)
Computational Modelling in Engineering (profile)
Free Elective Courses (3. semester)
Computer Engineering (profile)
Free Elective Courses (3. semester)
Computer Science (profile)
Elective Courses of the Profile (3. semester)
Control Systems and Robotics (profile)
Elective Courses of the Profile (3. semester)
Data Science (profile)
Elective Courses of the Profile (3. semester)
Electrical Power Engineering (profile)
Free Elective Courses (3. semester)
Electric Machines, Drives and Automation (profile)
Free Elective Courses (3. semester)
Electronic and Computer Engineering (profile)
Free Elective Courses (3. semester)
Electronics (profile)
Free Elective Courses (3. semester)
Information and Communication Engineering (profile)
Elective Courses of the Profile (3. semester)
Network Science (profile)
Free Elective Courses (3. semester)
Software Engineering and Information Systems (profile)
Elective Courses of the Profile (3. semester)

Literature

Shai Shalev-Shwartz, Shai Ben-David: Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.
Bernhard Schölkopf, Alexander J. Smola: Learning with Kernels. The MIT Press, 2001.
Daphne Koller, Nir Friedman: Probabilistic Graphical Models: Principles and Techniques. The MIT Press, 2009.

General

ID 222787
Winter semester
5 ECTS
L3 English Level
L1 e-Learning
30 Lectures
15 Exercises
15 Laboratory exercises