Computational Statistics

Course Description

Introduction to modern computational methods used in situations where classical statistical methods are not applicable (e.g., the bootstrap), and where one wants to assess the behavior, robustness, or predictive performance of a method or model (e.g., estimating the prediction error of bagged classification trees).
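
As an illustration of the bootstrap idea mentioned above, the following is a minimal sketch (not part of the course materials) of Efron's nonparametric bootstrap used to estimate the standard error of a sample median in Python; the data set, sample size, and number of resamples are arbitrary assumptions.

  import numpy as np

  rng = np.random.default_rng(0)

  # Hypothetical data: any one-dimensional sample would do here.
  x = rng.exponential(scale=2.0, size=50)

  # Efron's nonparametric bootstrap: resample with replacement, recompute the statistic.
  B = 2000
  medians = np.empty(B)
  for b in range(B):
      resample = rng.choice(x, size=x.size, replace=True)
      medians[b] = np.median(resample)

  # Bootstrap estimate of the standard error of the sample median.
  se_boot = medians.std(ddof=1)
  print(f"sample median = {np.median(x):.3f}, bootstrap SE = {se_boot:.3f}")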

Learning Outcomes

  1. Distinguish between different types of random number generators
  2. Define and design Monte Carlo experiments
  3. Apply Monte Carlo estimation
  4. Explain the principles of graphical methods in computational statistics
  5. Apply repeated sampling methods
  6. Apply statistical learning methods to real world problems

Forms of Teaching

Lectures

Independent assignments

Laboratory

Week by Week Schedule

  1. Markov Chain Monte Carlo sampling methods: Gibbs sampling and the Metropolis-Hastings algorithm (a minimal illustrative sketch follows the schedule)
  2. Markov Chain Monte Carlo sampling methods: Gibbs sampling and the Metropolis-Hastings algorithm
  3. Monte Carlo inference (rejection sampling, importance sampling)
  4. Monte Carlo inference (rejection sampling, importance sampling)
  5. Nonparametric density estimation (estimation of a density, role of the bandwidth, higher dimensions)
  6. Classification of resampling methods: randomisation (exact and approximate), Jackknife, Bootstrap, Cross-validation
  7. Bootstrap (Efron's nonparametric bootstrap, double bootstrap, model-based bootstrap)
  8. Midterm exam
  9. Bootstrap (Efron's nonparametric bootstrap, double bootstrap, model-based bootstrap)
  10. Nonparametric regression (kernel regression estimator, local polynomial nonparametric regression estimator, smoothing splines and penalized regression)
  11. Nonparametric regression (kernel regression estimator, local polynomial nonparametric regression estimator, smoothing splines and penalized regression)
  12. Flexible regression and classification (additive models, projection pursuit regression, classification and regression trees, variable selection, regularization, ridging and the LASSO)
  13. Flexible regression and classification (additive models, projection pursuit regression, classification and regression trees, variable selection, regularization, ridging and the LASSO)
  14. Bagging and boosting
  15. Final exam
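
To give a flavour of the MCMC topics scheduled for weeks 1 and 2, here is a minimal random-walk Metropolis-Hastings sketch in Python (an illustrative assumption, not course material); it draws samples from a standard normal target using a symmetric Gaussian proposal.

  import numpy as np

  rng = np.random.default_rng(1)

  def log_target(x):
      # Unnormalised log-density of the target distribution (standard normal).
      return -0.5 * x * x

  # Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.
  n_samples, step = 10_000, 1.0
  samples = np.empty(n_samples)
  x = 0.0
  for i in range(n_samples):
      proposal = x + step * rng.normal()
      # Symmetric proposal, so the acceptance probability reduces to the target ratio.
      if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
          x = proposal
      samples[i] = x

  print(f"mean ~ {samples.mean():.3f}, variance ~ {samples.var():.3f}")  # expect ~0 and ~1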

Study Programmes

University graduate
Audio Technologies and Electroacoustics (profile)
Free Elective Courses (2. semester)
Communication and Space Technologies (profile)
Free Elective Courses (2. semester)
Computational Modelling in Engineering (profile)
Free Elective Courses (2. semester)
Computer Engineering (profile)
Free Elective Courses (2. semester)
Computer Science (profile)
Free Elective Courses (2. semester)
Control Systems and Robotics (profile)
Free Elective Courses (2. semester)
Data Science (profile)
Elective Courses of the Profile (2. semester)
Electrical Power Engineering (profile)
Free Elective Courses (2. semester)
Electric Machines, Drives and Automation (profile)
Free Elective Courses (2. semester)
Electronic and Computer Engineering (profile)
Free Elective Courses (2. semester)
Electronics (profile)
Free Elective Courses (2. semester)
Information and Communication Engineering (profile)
Free Elective Courses (2. semester)
Network Science (profile)
Free Elective Courses (2. semester)
Software Engineering and Information Systems (profile)
Free Elective Courses (2. semester)

Literature

  1. J. E. Gentle, Elements of Computational Statistics, Springer Verlag, New York, 2002.
  2. L. Devroye, Non-Uniform Random Variate Generation, Springer Verlag, 1986.
  3. C. R. Rao, Handbook of Statistics, Vol. 9: Computational Statistics, North Holland, 1993.
  4. T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning, Springer Verlag, 2001.
  5. A. C. Davison, D. V. Hinkley, Bootstrap Methods and Their Application, Cambridge University Press, 1997.


General

ID 222761
Summer semester
5 ECTS
English level: L3
e-Learning level: L1
Lectures: 45 hours
Laboratory exercises: 15 hours

Grading System

Excellent
Very Good
Good
Acceptable