Machine Learning for Computer Vision (IN2357) (2h + 2h, 5ECTS)
WS 2018, TU München
You can use our library for the programming exercises: mlcv-tutorial
October 12th: The first tutorial will be on November 8th.
November 6th: Link for Piazza: https://piazza.com/tum.de/fall2018/in2357
1. Attendance to the lecture is open for all.
2. If the degree you are pursuing is not in Computer Science and you want to take the exam, you should ask the administrative staff responsible for your degree whether that is possible (it most probably is).
3. If you are an LMU student and you want to take the exam, you should ask the administrative staff responsible for your degree whether that is possible (it most probably is).
4. There is no way to get extra points for your final grade, such as bonus exercises, etc.
Lecture
Location: 5620.01.102 Interims Hörsaal 2
Date: Fridays, starting from October 19th
Time: 16.00 - 18.00
Lecturer: PD Dr. habil. Rudolph Triebel
Tutorial
Location: 2501 (Hörsaal 1), Building: 5101 Physik I
Date: Thursdays, starting from November 8th
Time: 16.00 - 18.00
Lecturer: John Chiotellis, Maximilian Denninger
In this lecture, students are introduced to the machine learning methods most frequently used in computer vision and robotics applications. The main aim of the lecture is to give a broad overview of existing methods and to convey their motivations and main ideas in the context of computer vision and pattern recognition.
Note that the lecture has a new module number: in earlier semesters it was IN3200; now it is IN2357. The content, however, is (almost) the same. For material from previous semesters, please refer to, e.g., WS2017.
| Topic | Lecture Date | Tutorial Date |
|---|---|---|
| Introduction / Probabilistic Reasoning | 19.10 | 08.11 |
| Graphical Models I | 02.11 | 08.11 |
| Graphical Models II | 09.11 | 15.11 |
| Bagging and Boosting | 16.11 | 22.11 |
| Kernel Regression and Gaussian Processes | 30.11 | 06.12 |
| Gaussian Processes for Classification | 14.12 | 20.12 |
| Variational Inference I | 18.01 | 24.01 |
| Variational Inference II | 25.01 | 31.01 |
Linear Algebra, Calculus and Probability Theory are essential building blocks for this course. The homework exercises do not have to be handed in. Solutions for the programming exercises will be provided in Python.
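As a flavour of what the Python programming exercises involve, here is a minimal sketch in the spirit of the first topic (probabilistic reasoning): computing a class posterior with Bayes' rule using NumPy. The numbers are made up for illustration and are not taken from the actual exercise sheets.

```python
import numpy as np

# Hypothetical two-class example: prior P(C) and likelihoods P(x | C)
# for a single observation x. Values are illustrative only.
prior = np.array([0.7, 0.3])        # P(C)
likelihood = np.array([0.2, 0.9])   # P(x | C)

joint = prior * likelihood          # P(x, C) = P(x | C) * P(C)
posterior = joint / joint.sum()     # P(C | x), normalized to sum to 1

print(posterior)
```

Although the prior favours class 0, the much higher likelihood under class 1 flips the posterior in its favour, which is exactly the kind of reasoning the first lecture covers.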
1. Introduction to Probabilistic Reasoning
2. Logistic Regression
3. Graphical Models I
4. Graphical Models II
5. Boosting and Bagging
6. Metric Learning
7. Kernel Methods and Gaussian Processes
8. Deep Learning Notebooks