Neural Networks
Data is displayed for academic year 2023/2024.
Course Description
The course enables students to gain knowledge of the theory and applications of artificial neural networks.
The following topics are covered: Biological and artificial neural networks. Models of neurons. Activation functions. Network topologies. Perceptron. Learning rules. Associative networks. Linear association. Multilayer networks. Delta rule for error backpropagation. Support vector machines. Radial basis function networks. Recurrent networks. Hopfield network. Energy function. Boltzmann machine. Simulated annealing. K-means algorithm. Kohonen's self-organizing network. Simulation software packages. Applications in pattern recognition and in signal and image analysis.
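To give a first feel for the perceptron topic listed above, here is a minimal Python sketch of the perceptron learning rule on a synthetic dataset. The data, learning rate, and epoch count are illustrative assumptions, not course materials.

```python
import numpy as np

# Minimal perceptron sketch (illustrative, not course-supplied code).
# A single threshold neuron trained with the perceptron learning rule
# on a small synthetic, linearly separable dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                       # 100 samples, 2 features
y = (X @ np.array([1.5, -2.0]) > 0).astype(float)   # separable labels

w, b, eta = np.zeros(2), 0.0, 0.1                   # weights, bias, learning rate
for epoch in range(20):
    for x_i, y_i in zip(X, y):
        y_hat = float(x_i @ w + b > 0)              # threshold activation
        w += eta * (y_i - y_hat) * x_i              # weights change only on mistakes
        b += eta * (y_i - y_hat)

print("learned weights:", w, "bias:", b)
```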
Study Programmes
University graduate
[FER3-EN] Data Science - profile
Elective courses
(1st semester)
Recommended elective courses
(3rd semester)
Learning Outcomes
- Understanding the basic concepts of neural networks
- Ability to create solutions based on neural networks
- Ability to adapt existing neural networks to a new problem
- Ability to use existing software frameworks for neural networks
- Ability to evaluate the performance of neural network based solutions
Forms of Teaching
Lectures
The lectures present theoretical concepts and algorithms followed by concrete examples.
Laboratory
Laboratory exercises are done on computers. During the exercises, students try out theoretical concepts and apply them to specific problems.
Other
A team project in which students solve a real practical problem in biomedical image analysis.
Grading Method
| Type | Threshold (Continuous Assessment) | Percent of Grade (Continuous Assessment) | Threshold (Exam) | Percent of Grade (Exam) |
|---|---|---|---|---|
| Laboratory Exercises | 50 % | 20 % | 50 % | 20 % |
| Seminar/Project | 20 % | 20 % | 20 % | 20 % |
| Mid Term Exam: Written | 20 % | 30 % | | 0 % |
| Final Exam: Written | 20 % | 30 % | | |
| Exam: Written | | | 50 % | 60 % |
Comment:
The threshold on the sum of the midterm and the final exam is 50%.
Week by Week Schedule
- Perceptron (learning paradigms, Hebbian learning, competitive learning, Boltzmann learning)
- Perceptron (learning paradigms, Hebbian learning, competitive learning, Boltzmann learning)
- Perceptron (learning paradigms, Hebbian learning, competitive learning, Boltzmann learning)
- Multilayer perceptron (error-backpropagation learning, credit-assignment problem, backpropagation through time)
- Multilayer perceptron (error-backpropagation learning, credit-assignment problem, backpropagation through time)
- Radial basis function networks (solving interpolation problem with radial basis function networks, generalized radial basis function networks, relation to regularization theory)
- Support vector machine for classification
- Midterm exam
- Recurrent neural networks (Hopfield network, Hopfield network energy function, Boltzmann machine, Elman networks, Jordan networks) and learning algorithms (backpropagation through time, recurrent backpropagation); a minimal Hopfield sketch appears after this schedule
- Self-organizing networks (unsupervised Hebbian learning, Oja's learning rule, PCA using a self-organizing network, Sanger's learning rule, unsupervised competitive learning, winner-takes-all network, Kohonen's self-organizing maps); a SOM sketch also appears after this schedule
- Network ensembles (committee machines, mixture of experts, convolutional neural networks)
- Deep convolutional networks: layers, architectures, visualization, fine tuning, applications, implementation
- Deep generative models: stacked RBMs, convolutional autoencoders, variational autoencoders, adversarial models, sparsity
- Project
- Final exam
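As referenced in the recurrent-networks week above, the following is a minimal sketch of a bipolar Hopfield network with Hebbian storage and the standard energy function. The stored patterns and network size are illustrative assumptions.

```python
import numpy as np

# Minimal Hopfield network sketch (illustrative; patterns and sizes are
# assumptions, not course data). States are bipolar vectors in {-1, +1}.

def train_hebbian(patterns):
    """Store patterns with the Hebbian outer-product rule, zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns) / len(patterns)
    np.fill_diagonal(W, 0)                      # no self-connections
    return W

def energy(W, s):
    """Hopfield energy E(s) = -1/2 s^T W s; asynchronous updates never raise it."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=200, seed=0):
    """Asynchronous sign updates that settle toward a stored attractor."""
    s, rng = s.copy(), np.random.default_rng(seed)
    for _ in range(steps):
        i = rng.integers(len(s))                # pick one neuron at random
        s[i] = 1 if W[i] @ s >= 0 else -1       # sign update of that neuron
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
W = train_hebbian(patterns)
noisy = patterns[0].copy()
noisy[0] = -noisy[0]                            # corrupt one bit
print(energy(W, noisy), energy(W, recall(W, noisy)))  # energy does not increase
```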
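And the SOM sketch referenced above: Kohonen's self-organizing map on a one-dimensional grid of neurons. The grid size, learning-rate decay, and neighborhood schedule are illustrative choices.

```python
import numpy as np

# Minimal sketch of Kohonen's self-organizing map on a 1-D neuron grid
# (illustrative hyperparameters, not course-supplied code).
rng = np.random.default_rng(0)
data = rng.uniform(size=(500, 2))             # 2-D inputs in the unit square
grid = rng.uniform(size=(10, 2))              # 10 neurons, 2-D weight vectors

for t in range(1000):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(np.linalg.norm(grid - x, axis=1))  # best-matching unit
    eta = 0.5 * np.exp(-t / 500)                       # decaying learning rate
    sigma = 3.0 * np.exp(-t / 500)                     # shrinking neighborhood
    dist = np.abs(np.arange(10) - bmu)                 # grid distance to the BMU
    h = np.exp(-dist**2 / (2 * sigma**2))              # neighborhood function
    grid += eta * h[:, None] * (x - grid)              # pull neighbors toward x

print(grid)  # neuron weights ordered along the grid after training
```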
Literature
Simon S. Haykin (2009), Neural Networks and Learning Machines, Prentice Hall
James A. Anderson (1995), An Introduction to Neural Networks, MIT Press
Charu C. Aggarwal (2023), Neural Networks and Deep Learning, Springer Nature
General
- ID: 222982
- Semester: winter
- ECTS credits: 5
- English level: L1
- e-Learning level: L2
- Lectures: 30 hours
- Seminar: 0 hours
- Exercises: 0 hours
- Laboratory exercises: 15 hours
- Project laboratory: 0 hours
- Physical education exercises: 0 hours
Grading System
- 87 %: Excellent
- 75 %: Very Good
- 63 %: Good
- 51 %: Sufficient