Neural Networks

Data is displayed for academic year 2023/2024.


Course Description

The course enables students to gain knowledge of the theory and applications of artificial neural networks: Biological neural networks. Artificial neural networks. Definition. Models of neurons. Activation functions. Network topologies. Perceptron. Learning rules. Associative networks. Linear association. Recursive associative networks. Hopfield network. Energy function. Multilayer networks. Radial basis function networks. Support vector machines. Delta rule for error backpropagation. Kohonen's self-organizing network. K-means algorithm. Boltzmann machine. Simulated annealing. Simulation software packages. Applications in pattern recognition and signal and image analysis.
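The perceptron and its learning rule, listed among the topics above, can be illustrated with a short example. This is a minimal sketch, not course material; the learning rate, epoch count, and the AND-gate task are illustrative assumptions.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Rosenblatt perceptron learning rule with a step activation."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            pred = 1 if w @ x + b > 0 else 0  # step activation function
            w += lr * (t - pred) * x          # weights change only on mistakes
            b += lr * (t - pred)
    return w, b

# Toy linearly separable problem: the logical AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [1 if w @ x + b > 0 else 0 for x in X]  # [0, 0, 0, 1]
```

Because the AND problem is linearly separable, the perceptron convergence theorem guarantees the rule stops making mistakes after finitely many updates.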

Study Programmes

University graduate
[FER3-EN] Data Science - profile
Elective courses (1st semester)
Recommended elective courses (3rd semester)

Learning Outcomes

  1. Understanding the basic concepts of neural networks
  2. Ability to create solutions based on neural networks
  3. Ability to adapt existing neural networks to a new problem
  4. Ability to use existing software frameworks for neural networks
  5. Ability to evaluate the performance of neural network based solutions  

Forms of Teaching

The lectures present theoretical concepts and algorithms, followed by concrete examples.

Laboratory exercises are done on computers. During the exercises, students try out theoretical concepts and apply them to specific problems.

In the team project, students solve a real practical problem of biomedical image analysis.
Grading Method

                         Continuous Assessment        Exam
Type                     Threshold  % of Grade        Threshold  % of Grade
Laboratory Exercises     50 %       20 %              50 %       20 %
Seminar/Project          20 %       20 %              20 %       20 %
Mid Term Exam: Written   20 %       30 %              -          0 %
Final Exam: Written      20 %       30 %              -          -
Exam: Written            -          -                 50 %       60 %

The threshold on the sum of the midterm and the final exam is 50%.

Week by Week Schedule

  1. Perceptron (learning paradigms, Hebbian learning, competitive learning, Boltzmann learning)
  2. Perceptron (learning paradigms, Hebbian learning, competitive learning, Boltzmann learning)
  3. Perceptron (learning paradigms, Hebbian learning, competitive learning, Boltzmann learning)
  4. Multilayer perceptron (error-backpropagation learning, credit-assignment problem, backpropagation through time)
  5. Multilayer perceptron (error-backpropagation learning, credit-assignment problem, backpropagation through time)
  6. Radial basis function networks (solving interpolation problem with radial basis function networks, generalized radial basis function networks, relation to regularization theory)
  7. Support vector machine for classification
  8. Midterm exam
  9. Recurrent neural networks (Hopfield network, Hopfield network energy function, Boltzmann machine, Elman networks, Jordan networks) and learning algorithms (backpropagation through time, recurrent backpropagation)
  10. Self-organizing networks (Hebbian unsupervised learning, Oja's learning rule, PCA using a self-organizing network, Sanger's learning rule, competitive unsupervised learning, winner-takes-all network, Kohonen's self-organizing maps)
  11. Network ensembles (committee machines, mixture of experts, convolutional neural networks)
  12. Deep convolutional networks: layers, architectures, visualization, fine tuning, applications, implementation
  13. Deep generative models: stacked RBMs, convolutional autoencoders, variational autoencoders, adversarial models, sparsity              
  14. Project
  15. Final exam
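The Hopfield network and its energy function from week 9 can be sketched in a few lines. This is an illustrative example only (the stored pattern, network size, and update schedule are assumptions, not course materials): patterns are stored by the Hebbian outer-product rule, and asynchronous threshold updates descend the energy function toward a stored attractor.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian storage: W is the (scaled) sum of outer products of bipolar patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def energy(W, s):
    """Hopfield energy function; asynchronous updates never increase it."""
    return -0.5 * s @ W @ s

def recall(W, s, sweeps=5):
    """Asynchronous threshold updates, one unit at a time."""
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store one bipolar pattern, corrupt one bit, and recover it.
p = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train_hopfield(p[None, :])
noisy = p.copy()
noisy[0] = -noisy[0]
restored = recall(W, noisy)  # equals p again
```

The corrupted state has higher energy than the stored pattern, so the asynchronous dynamics slide back into the stored attractor.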


Literature

S. Haykin (1998), Neural Networks, 2nd Ed., Prentice Hall
J. A. Anderson (1995), An Introduction to Neural Networks, MIT Press

For students

ID: 222982
Semester: Winter
English Level: L1
e-Learning Level: L2
Lectures: 30
Seminar: 0
Exercises: 0
Laboratory exercises: 15
Project laboratory: 0

Grading System

≥ 87 %  Excellent
≥ 75 %  Very Good
≥ 63 %  Good
≥ 51 %  Sufficient