Information Theory
Data is displayed for the academic year 2024/2025.
Laboratory exercises
Course Description
Introduction to Shannon's quantitative theory of information and its applications. Mathematical definition and properties of information. Message. Amount of information carried by a message. Entropy. Mutual information. Discrete channel capacity. Discrete memoryless sources. Discrete sources with memory (Markov chains). Source coding theorem, lossless data compression and optimal lossless coding. Shannon-Fano coding. Huffman coding. Arithmetic coding. Dictionary-based coding (LZ77, LZW). Error control. Parity coding. Cyclic codes. Hamming codes. Convolutional codes. Viterbi algorithm.
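To make two of the topics named above concrete, the sketch below is an illustration only, not part of the official course material: it computes the entropy of a discrete memoryless source and builds a binary Huffman code for it in Python. The symbols and probabilities are invented for the example; the distribution is dyadic, so the average codeword length meets the entropy exactly.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Build a binary Huffman code; probs maps symbol -> probability."""
    # Each heap entry: (probability, tie-breaker, {symbol: codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
        p1, i, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, i, merged))
    return heap[0][2]

# Hypothetical example source used only for this sketch.
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(source)
avg_len = sum(p * len(code[s]) for s, p in source.items())
print(entropy(source.values()), code, avg_len)
```

For this source the script reports an entropy of 1.75 bits and codeword lengths of 1, 2, 3 and 3 bits, so the average code length is also 1.75 bits per symbol, the best any lossless code can do here by the source coding theorem.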
Prerequisites
Mathematical Analysis
Study Programmes
University undergraduate
[FER3-EN] Computing study programme
(3rd semester)
Learning Outcomes
- identify information, coding and communication problems
- explain coding and compression methods and information-theoretic limits
- apply the acquired knowledge to the analysis of real systems
- analyze complex information and communication systems
- explain phenomena in different areas of science
- estimate the performance of different information and communication systems
- apply entropy coding and error-correcting coding techniques
Forms of Teaching
Lectures
Lectures are held in three-hour blocks.
Independent assignments
Students solve assigned problems; this is not mandatory for all students.
Laboratory
Mandatory for all students; each subgroup solves one problem.
Grading Method
| Type | Continuous Assessment: Threshold | Continuous Assessment: Percent of Grade | Exam: Threshold | Exam: Percent of Grade |
| --- | --- | --- | --- | --- |
| Mid Term Exam: Written | 10 % | 50 % | | 0 % |
| Final Exam: Written | 10 % | 50 % | | |
| Exam: Written | | | 40 % | 100 % |
Comment:
Although the laboratory exercises do not contribute to the total number of points in this course, completing them is a necessary requirement.
Week by Week Schedule
- Information theory history and importance; symbol, message, information, communication
- Discrete communication system, probabilistic view and information measures
- Entropy, noiseless coding theorem, mutual information
- Information sources
- Types of codes, Optimal code, Entropy coding
- Entropy coding
- Entropy coding, Lossy coding
- Midterm exam
- Error detecting and correcting codes, block codes
- Hamming distance, code equivalence, perfect codes
- Binary linear block codes, generator matrix, parity-check matrix, syndrome (see the sketch after this schedule)
- Types of binary linear block codes
- Convolutional and turbo coding
- Channel capacity, noisy-channel coding theorem
- Final exam
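As a companion to the block-coding weeks above, the following is a minimal Python sketch of syndrome decoding for a (7,4) Hamming code. The generator and parity-check matrices below are one common textbook convention, assumed here for illustration only; the course may use a different ordering of the bits.

```python
import numpy as np

# Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I]
# (one common textbook choice, not necessarily the course's convention).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Encode a 4-bit message into a 7-bit codeword (arithmetic mod 2)."""
    return (np.array(msg) @ G) % 2

def decode(received):
    """Correct at most one bit error via the syndrome, then strip parity bits."""
    r = np.array(received)
    syndrome = (H @ r) % 2
    if syndrome.any():
        # The syndrome equals the column of H at the error position.
        for pos in range(H.shape[1]):
            if np.array_equal(H[:, pos], syndrome):
                r[pos] ^= 1
                break
    return r[:4]  # systematic code: the first 4 bits are the message

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[2] ^= 1            # flip one bit to simulate a channel error
print(decode(cw))     # recovers [1, 0, 1, 1]
```

Because all columns of H are distinct and nonzero, every single-bit error produces a unique nonzero syndrome, which is why one flipped bit can always be located and corrected.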
Literature
Igor S. Pandžić et al. (2009), Uvod u teoriju informacije i kodiranje, Element
Željko Ilić, Alen Bažant, Tomaž Beriša (2014), Teorija informacije i kodiranje, Element
Roberto Togneri, Christopher J. S. deSilva (2003), Fundamentals of Information Theory and Coding Design, CRC Press
General
ID 209649
Winter semester
4 ECTS
L0 English Level
L1 e-Learning
45 Lectures
0 Seminar
0 Exercises
15 Laboratory exercises
0 Project laboratory
0 Physical education exercises
Grading System
85 Excellent
70 Very Good
55 Good
40 Sufficient