Machine Learning for Computer Vision (IN2357) (2h + 2h, 5ECTS)
SS 2018, TU München
You can use our library for the programming exercises: mlcv-tutorial
April 17th: The rooms for both lecture and tutorial have changed. See below.
April 20th: New frequently asked questions section. See below.
June, 25th: There will be no tutorial on Thursday, June 28.
July 15th: There will be no repeat exam in SS2018.
July 25th: No cheat sheets, calculators, or other aids are allowed in the exam.
1. Attendance of the lecture is open to all.
2. If the degree you are pursuing is not in Computer Science and you want to take the exam, ask the administrative staff responsible for your degree whether that is possible (it most probably is).
3. The same applies if you are an LMU student: ask the administrative staff responsible for your degree whether taking the exam is possible (it most probably is).
4. There is no way to earn extra points toward your final grade (e.g., through bonus exercises).
Lecture
Location: MW 0350 (Egbert von Hoyer)
Date: Fridays, starting from April 13th
Time: 14.00 - 16.00
Lecturer: PD Dr. habil. Rudolph Triebel
Tutorial
Location: Interimshörsaal 2
Date: Thursdays, starting from April 19th
Time: 16.00 - 18.00
Lecturer: John Chiotellis, Maximilian Denninger
Time: 14.00 - 15.00
In this lecture, students will be introduced to the machine learning methods most frequently used in computer vision and robotics applications. The major aim of the lecture is to provide a broad overview of existing methods and an understanding of their motivations and main ideas in the context of computer vision and pattern recognition.
Note that the lecture has a new module number: in earlier semesters it was IN3200, now it is IN2357. The content, however, is (almost) the same. For material from previous semesters, please refer to, e.g.: WS2017
| Topic | Lecture Date | Tutorial Date |
| --- | --- | --- |
| Introduction / Probabilistic Reasoning | 13.04 | 19.04 |
| Graphical Models I | 27.04 | 03.05 |
| Graphical Models II | 04.05 | 10.05 |
| Bagging and Boosting | 11.05 | 17.05 |
| Sequential Data / Hidden Markov Models | 01.06 | 07.06 |
| Kernels and Gaussian Processes | 08.06 | 14.06 |
| Variational Inference I | 29.06 | 05.07 |
| Variational Inference II | 06.07 | 12.07 |
Linear Algebra, Calculus, and Probability Theory are essential building blocks of this course. The homework exercises do not have to be handed in. Solutions for the programming exercises will be provided in Python.
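To give a flavor of what the Python programming exercises look like, here is a minimal sketch of K-means clustering (one of the lecture topics). This is an illustrative example only, not taken from the course materials or the mlcv-tutorial library; it uses plain NumPy.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and mean update."""
    rng = np.random.default_rng(seed)
    # initialize centroids as k distinct random data points
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # assign each point to its nearest centroid (Euclidean distance)
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute each centroid as the mean of its assigned points
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break  # converged
        centroids = new_centroids
    return labels, centroids

# toy data: two well-separated Gaussian blobs around (0, 0) and (5, 5)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
```

The tutorial versions of such exercises typically extend this with soft (EM-style) assignments for Gaussian mixture models, which is covered in the clustering lectures.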
1. Introduction and Probabilistic Reasoning
3. Graphical Models I
4. Graphical Models II
5. Boosting and Bagging
6. Metric Learning
7. Deep Learning Notebooks UPDATED
9. Kernel Methods and Gaussian Processes
10. Clustering I: K-means and EM for GMMs
11. Clustering II: Dirichlet Process and Spectral Clustering
12. Variational Inference I: Mean Field
13. Variational Inference II: Expectation Propagation / Sampling I
14. Sampling II
12. Deep Learning with Solutions UPDATED