- The results of the second exam are now online.
- The exam inspection will take place on 29.04.2015 (Wednesday) from 15:00 to 17:00 in room 633 of the MPII (the meeting room on the 6th floor).
- Date and time: 10.04.2015 (Friday), 9:00 - 12:00
- Location: E2 2, Günter-Hotz-Hörsaal
- Please bring your student identity card - otherwise you will not be admitted to the exam!
- Please bring paper for the exam.
- Please arrive at least 10 minutes early so that you can find your lecture hall and we can start on time. We will announce soon in which lecture halls the exam will take place.
- It is a closed-book exam: no notes, books, pocket calculators, or any other aids except pen and paper are allowed. Mobile phones, tablets, laptops, and other electronic devices must be turned off.
- The list of students admitted to the exam is now online.
- The results of the first exam are now online.
- The exam inspection will take place on 26.03.2015 (Thursday) from 15:00 to 17:00 in room 633 of the MPII (the meeting room on the 6th floor).
- The Thursday tutorial on Feb 12th will take place in room 023, E1 4.
- No lecture on Friday, Feb 6th. The lecture will resume on Monday, Feb 9th.
- Group 2 (tutor Maksim): no exercise classes on January 9th
- Exam date: February 26th 2015, 2pm-5pm
- Re-exam date: April 10th 2015 9am-noon
- No exercise classes on November 13th and 14th.
- There will be no lecture on October 20th due to the Semester Kick-Off Event and on October 24th due to a conference.
- For late registrations please send an email to firstname.lastname@example.org
- As announced in the lecture, the exercise classes given by the tutors start November 6th and November 7th respectively.
- Please find your assignment to the exercise groups: list_revised
From a broader perspective, machine learning tries to automate the process of the empirical sciences, namely extracting knowledge about natural phenomena from measured data, with the goal of either better understanding the underlying processes or making good predictions. Machine learning methods are therefore widely used in many different fields: bioinformatics, computer vision, information retrieval, computational linguistics, robotics, …
The lecture gives a broad introduction to machine learning methods. After the lecture, students should be able to solve and analyze learning problems. The lecture is based on the machine learning lecture of Matthias Hein.
- Semester: winter term
- Year: 2014
- Type: Core lecture (Stammvorlesung), 9 credit points
- Time and Location:
- Monday 4pm-6pm, lecture hall HS002 in E1 3
- Friday 2pm-4pm, lecture hall HS002 in E1 3
- Exercise groups
- Thursday 10am-12pm, seminar room 024 in E1 4
- Friday 10am-12pm, seminar room 024 in E1 4
- Friday 10am-12pm, seminar room 022 in E1 4
- Exercise sheets are handed in in groups of 3.
- 50% of the exercise points (accumulated up to that point) are required to take part in the exams (end-term/re-exam).
List of topics (tentative)
- Reminder of probability theory
- Maximum Likelihood/Maximum A Posteriori Estimators
- Bayesian decision theory
- Linear classification and regression
- Kernel methods
- Model selection and evaluation of learning methods
- Feature selection
- Nonparametric methods
- Boosting, Decision trees
- Neural networks
- Structured Output
- Semi-supervised learning
- Unsupervised learning (Clustering, Independent Component Analysis)
- Dimensionality Reduction and Manifold Learning
- Statistical learning theory
Previous knowledge of machine learning is not required. Participants should be familiar with linear algebra, analysis, and probability theory at the level of the local 'Mathematics for Computer Scientists I-III' lectures. In particular, attendees should be familiar with
- Discrete and continuous probability theory (marginals, conditional probability, random variables, expectation etc.)
The first three chapters of L. Wasserman: All of Statistics, Springer (2004) provide the necessary background.
- Linear algebra (rank, linear systems, eigenvalues, eigenvectors (in particular for symmetric matrices), singular values, determinant)
A quick reminder of the basic ideas of linear algebra can be found in the tutorial by Mark Schmidt (I did not check it for correctness!). Apart from the LU factorization, this summarizes, in a non-formal way, everything that is used in the lecture. You might also find the sheets by Sam Roweis on matrix identities and Gaussian identities useful.
- Multivariate analysis (integrals, gradient, Hessian, extrema of multivariate functions)
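As a quick self-check of the linear-algebra and multivariate-analysis prerequisites, the following short sketch (in Python/NumPy rather than Matlab, purely for illustration; it is not part of the course material) verifies the eigendecomposition of a symmetric matrix and compares an analytic gradient with a finite-difference approximation:

```python
import numpy as np

# A symmetric matrix has real eigenvalues and orthonormal eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, V = np.linalg.eigh(A)                 # eigendecomposition A = V diag(w) V^T
assert np.allclose(V @ np.diag(w) @ V.T, A)
assert np.allclose(V.T @ V, np.eye(2))   # columns of V are orthonormal

# Multivariate analysis: for f(x) = x^T A x with A symmetric,
# the gradient is 2 A x; check it against central finite differences.
def f(x):
    return x @ A @ x

x = np.array([0.5, -1.0])
grad_analytic = 2 * A @ x
eps = 1e-6
grad_numeric = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(2)                   # unit vectors along each coordinate
])
assert np.allclose(grad_analytic, grad_numeric, atol=1e-4)
print("prerequisite self-check passed")
```

If any of these steps is unfamiliar, the references above are a good place to refresh the material before the lecture starts.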
- Exercise sheet 1_v2 due on 3.11.2014 (solution)
- Exercise sheet 2 v3 due on 14.11.2014 (solution)
- Exercise sheet 3 due on 21.11.2014 (solution)
- Exercise sheet 4 and data due on 1.12.2014 (solution)
- Exercise sheet 5 and data due on 8.12.2014 (solution)
- Exercise sheet 6 and data due on 15.12.2014 (solution)
- Exercise sheet 7 and data due on 05.01.2015 (solution)
- Exercise sheet 8 and data due on 12.01.2015 (solution)
- Exercise Sheet 9 and data due on 19.01.2015 (solution)
- Exercise Sheet 10 due on 26.01.2015 (solution)
- Exercise Sheet 11 and data due on 02.02.2015 (solution)
- Exercise Sheet 12 and data due on 09.02.2015 (solution)
The lecture will be partially based on the following books and partially on recent research papers:
- R.O. Duda, P.E. Hart, and D.G.Stork: Pattern Classification, Wiley, (2000).
- B. Schoelkopf and A. J. Smola: Learning with Kernels, MIT Press, (2002).
- J. Shawe-Taylor and N. Cristianini: Kernel Methods for Pattern Analysis, Cambridge University Press, (2004).
- C. M. Bishop: Pattern recognition and Machine Learning, Springer, (2006).
- T. Hastie, R. Tibshirani, J. Friedman: The Elements of Statistical Learning, Springer, second edition, (2008).
- L. Devroye, L. Gyoerfi, G. Lugosi: A Probabilistic Theory of Pattern Recognition, Springer, (1996).
- L. Wasserman: All of Statistics, Springer, (2004).
- S. Boyd and L. Vandenberghe: Convex Optimization, Cambridge University Press, (2004).
- Matlab is accessible via our campus license. Details on how to use it can be found here.
Access from outside should be possible via ssh: `ssh -X email@example.com`
- Material for Matlab: