Convex Optimization for Machine Learning and Computer Vision (IN2330) (2h + 2h, 6 ECTS)
* The final exam will be oral and take place on the afternoons of the 19th, 21st, and 23rd of February. The preliminary schedule is available with the lecture material. Send us an email if you did not register in class for a slot.
* On the 5th of February, we will have a summary lecture. Please send us an email with any questions that you would like to see reviewed that day.
* The first exercise class after Christmas takes place on Friday, January 19th.
* The class on December 4th is cancelled. Please hand in the exercises directly to Emanuel at his office (02.09.039).
* We are writing course notes that we will update as the course progresses. Check the link to the lecture material.
* The first exercise sheet that counts toward the exam bonus, exercise sheet 1, is online. It is due Monday, the 30th of October.
* Note that the exercise classes start on October 27th.
Many important problems in machine learning and computer vision can be formulated as convex optimization problems, e.g. training of SVMs, logistic regression, low-rank and sparse matrix decomposition, image segmentation, stereo matching, and surface reconstruction. In this lecture we will discuss first-order convex optimization methods to implement and solve the aforementioned problems efficiently. Particular attention will be paid to problems with constraints and non-differentiable terms, which give rise to methods that exploit the concept of duality, such as the primal-dual hybrid gradient method or the alternating direction method of multipliers. The lecture will cover the mathematical background needed to understand why these methods converge, as well as the details of their efficient implementation.
We will cover the following topics:
Elements of convex analysis
- Convex sets and convex functions
- Existence and uniqueness of minimizers
- Convex conjugates
- Saddle point problems and duality
- Gradient-based methods
- Proximal algorithms, primal-dual hybrid gradient method, alternating direction method of multipliers
- Convergence analysis
- Acceleration techniques, stopping criteria
- Introduction to Stochastic Optimization: Stochastic Gradient Method
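To give a flavor of the proximal algorithms listed above, here is a minimal sketch (in Python/NumPy purely for illustration; the course exercises use MATLAB) of proximal gradient descent, often called ISTA, applied to the LASSO problem min_x ½‖Ax − b‖² + λ‖x‖₁. The step size 1/L and the soft-thresholding proximal operator are the standard choices; the problem instance is made up for the demo.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient step on the smooth term
        x = soft_threshold(x - grad / L, lam / L)  # proximal step on the l1 term
    return x

# Tiny demo: recover a sparse vector from noiseless random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[1, 5]] = [2.0, -3.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

With a small λ and noiseless data, the iterate converges to a slightly biased copy of the sparse ground truth; the same splitting idea (smooth gradient step plus a cheap proximal map) underlies the primal-dual and ADMM methods covered later in the course.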
Example applications in machine learning and computer vision include:
- Low-rank and sparse matrix decomposition
- Training of SVMs and logistic regression
- Image reconstruction (e.g. denoising, deblurring, inpainting)
- Surface Reconstruction
We will implement some of these applications in MATLAB.
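As a taste of one such application, below is a minimal sketch of the stochastic gradient method applied to (unregularized) logistic regression, one of the listed topics. It is written in Python/NumPy for a self-contained illustration, though the course implementations are in MATLAB; the step size, epoch count, and demo data are arbitrary choices for this sketch.

```python
import numpy as np

def sgd_logistic(X, y, step=0.5, epochs=30, seed=0):
    # Stochastic gradient method for logistic regression:
    # min_w (1/n) * sum_i log(1 + exp(-y_i * <w, x_i>)),  with labels y_i in {-1, +1}.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):       # one shuffled pass over the data = one epoch
            margin = y[i] * (X[i] @ w)
            # Gradient of the i-th loss term only -- the "stochastic" part.
            g = -y[i] * X[i] / (1.0 + np.exp(margin))
            w -= step * g
    return w

# Tiny demo on linearly separable data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
w_true = np.array([1.5, -2.0])
y = np.sign(X @ w_true)
w_hat = sgd_logistic(X, y)
accuracy = np.mean(np.sign(X @ w_hat) == y)
```

Each update touches a single training example, which is what makes the method attractive when n is large; the lecture's convergence analysis makes the choice of step size precise.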
Location: Room 02.09.023
Time and Date: Monday 10:15 - 12:00
Start: October 16th, 2017
Lecturer: Dr. Virginia Estellers
The lecture is held in English.
Location: Room 02.09.023
Time and Date: Friday 09:15 - 11:00
Organization: Emanuel Laude
Start: October 27th, 2017
The exercise sheets consist of two parts, theoretical and programming exercises.
Exercise sheets will be posted every Monday and are due one week later.
Please submit your programming solutions via email to email@example.com as a zip file named “matriculationnumber_firstname_lastname.zip”, containing only your .m-files (no material files). Hand in the solutions to the theoretical exercises in Monday's lecture. We will return the corrected sheets on Friday when we discuss them in class.
Please remember to write clean, commented code! You may work on the exercise sheets in groups of two students.
The first exercise sheet that counts toward the exam bonus is exercise sheet 1, which is due Monday, the 30th of October.
The exercise sheets can be accessed here.
Depending on the number of students, the exam will be written or oral.
Course material (slides and exercise sheets) can be accessed here.
Send us an email if you need the password.