Convex Optimization for Machine Learning and Computer Vision (IN2330) (2h + 2h, 6 ECTS)
- The repeat exam is scheduled on October 8th.
- Sheet 10 is the last exercise sheet. The tutorial on July 11th will be a Q&A session.
- On July 9th, Emanuel Laude will present a special session on “Stochastic Gradient Descent”.
- The lecture on May 21st is shifted to May 23rd (Wednesday) at 12:15. The exercise session in that week is cancelled.
- The due date for the exercises is Monday before class at 16:15. The first sheet that counts toward the exam bonus is sheet 1, due on Monday (April 23rd). See below!
- The course website is permanently moved to the address: http://vision.in.tum.de/teaching/ss2018/cvx4cv
- There is neither lecture nor exercise class in the week Apr 9-13 due to the AISTATS Conference.
Many important machine learning and computer vision tasks can be formulated as convex optimization problems, e.g. training of SVMs, logistic regression, low-rank and sparse matrix decomposition, image segmentation, stereo matching, surface reconstruction, etc. In this lecture we will discuss first-order convex optimization methods to solve the aforementioned problems efficiently. Particular attention will be paid to problems including constraints and non-differentiable terms, giving rise to methods that exploit the concept of duality, such as the primal-dual hybrid gradient method and the alternating direction method of multipliers. This lecture will cover the mathematical background needed to understand why these methods converge, as well as the details of their efficient implementation.
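To give a flavor of the duality-based methods mentioned above, here is a minimal NumPy sketch of the primal-dual hybrid gradient (Chambolle-Pock) iteration applied to the lasso problem min_x 0.5·||Ax − b||² + λ·||x||₁. This is an illustration only, not the course's reference implementation; the function names and step-size choices are our own.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (component-wise soft shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdhg_lasso(A, b, lam, n_iter=2000):
    """Primal-dual hybrid gradient sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1  (illustrative only)."""
    m, n = A.shape
    L = np.linalg.norm(A, 2)       # operator norm ||A||
    tau = sigma = 1.0 / L          # step sizes satisfying tau*sigma*||A||^2 <= 1
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    for _ in range(n_iter):
        # Dual step: prox of sigma*f* with f(y) = 0.5*||y - b||^2,
        # which works out to (z - sigma*b) / (1 + sigma).
        y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
        x_old = x
        # Primal step: prox of tau * lam * ||.||_1 is soft thresholding.
        x = soft_threshold(x - tau * (A.T @ y), tau * lam)
        # Extrapolation (over-relaxation) step.
        x_bar = 2.0 * x - x_old
    return x
```

For A equal to the identity, the lasso solution is simply the soft thresholding of b, which makes this sketch easy to sanity-check.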
We will cover the following topics:
Elements of convex analysis
- Convex sets and convex functions
- Existence and uniqueness of minimizers
- Convex conjugates
- Saddle point problems and duality
- Gradient-based methods
- Proximal algorithms, primal-dual hybrid gradient method, alternating direction method of multipliers
- Convergence analysis
- Acceleration techniques, stopping criteria
Exemplary applications in machine learning and computer vision include
- Training of SVMs, Logistic regression
- Image reconstruction (e.g. denoising, deblurring, inpainting)
- Low-rank and sparse matrix decomposition
We will implement some of them in MATLAB.
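One building block behind the low-rank and sparse decomposition application is the proximal operator of the nuclear norm, which soft-thresholds singular values (singular value thresholding). The course exercises use MATLAB; purely as an illustration, a NumPy sketch (the function name `svt` is our own choice):

```python
import numpy as np

def svt(M, t):
    # Proximal operator of t * ||.||_* (nuclear norm):
    # soft-threshold the singular values of M.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt
```

Applied to a rank-one matrix with singular value 3, thresholding by 1 simply shrinks that singular value to 2 while preserving the singular vectors.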
Location: Room 02.09.023
Time and Date: Monday 16:15 - 18:00
Start: April 16th, 2018
Lecturer: Dr. Tao Wu
The lecture is held in English.
The exercise sheets consist of two parts, theoretical and programming exercises.
Exercise sheets will be posted every Monday and are due a week later.
Please submit your programming solutions via email to firstname.lastname@example.org as a single zip file named “matriculationnumber_firstname_lastname.zip”, containing only your code files (no material files). Hand in the solutions to the theoretical part in Monday's lecture. The corrected sheets will be returned on Wednesday, when we discuss them in class.
Please remember to write clean, commented(!) code. You may work on the exercise sheets in groups of two students.
The first exercise sheet that counts toward the exam bonus is exercise sheet 1, which is due Monday, 23rd of April.
The exercise sheets can be accessed here.
You can improve your final grade by one step if you present a solution to a theoretical problem during the exercise classes at least once, and you obtain at least 75% of the total achievable points, excluding bonus points. For some assignments, indicated by (0 + ? points), you can obtain bonus points that do not count toward the 75% threshold. A grade worse than 4.0 cannot be improved, and a grade of 1.0 cannot be improved further.
Date: July 18th, 13:30 - 15:30. Place: 102, Interims Hörsaal 2 (5620.01.102).
The final exam will be written. You may bring one handwritten double-sided A4-size cheat sheet. You are not allowed to use any electronic devices.
Date: October 8th, 13:00 - 15:00. Place: MI Hörsaal 3 (5606.EG.011).
The repeat exam will be written. You may bring one handwritten double-sided A4-size cheat sheet. You are not allowed to use any electronic devices.
Course material (slides and exercise sheets) can be accessed here.
Send us an email if you need the password.