Welcome to CS 15-759

A Principled Approach to Optimization

About the Course

This course gives a rigorous treatment of several topics in the theory of convex optimization, with a particular focus on developing intuition for analyzing convex optimization processes from first principles. Topics may include: gradient descent, interior point methods, linear regression, linear programming, sparsification, and more.
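To give a flavor of the first topic above, here is a minimal sketch (not course material) of gradient descent on a smooth, strongly convex quadratic f(x) = ½ xᵀAx − bᵀx, whose minimizer solves Ax = b; the step size 1/L uses the smoothness constant L (the largest eigenvalue of A). The function name and matrix are illustrative choices, not taken from the course.

```python
import numpy as np

def gradient_descent(A, b, steps=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric PSD A."""
    L = np.linalg.eigvalsh(A).max()   # smoothness constant of f
    x = np.zeros_like(b)
    for _ in range(steps):
        grad = A @ x - b              # gradient of f at x
        x = x - grad / L              # x_{k+1} = x_k - (1/L) grad f(x_k)
    return x

# Illustrative 2x2 example: the iterates converge to the solution of Ax = b.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 1.0])
x = gradient_descent(A, b)
```

Since f is also strongly convex (A is positive definite here), the error contracts geometrically, which is the kind of first-principles analysis the course develops.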

Grading

Tentatively:

Schedule

Class meetings: TBD. Office hours: TBD.

Obviously, subject to change.
Week  Topics
1     Introduction: convexity, linear algebra, and geometry. Linear programming.
2     ℓ2 gradient descent, Richardson iteration, smoothness and strong convexity
3     ℓp gradient descent, smoothness and strong convexity of p-norms, softmax
4     p-norm methods, Bregman divergences
5     Multiplicative weights and energy boosting for ℓp-regression
6     Speeding up p-norm methods: sparsification
7     Speeding up p-norm methods: Lewis weights
8     High-accuracy linear programming: potential reduction interior point methods (IPMs)
9     Path-following interior point methods and self-concordance
10    Lewis weights and IPMs: the Lee-Sidford barrier
11    Convex geometry: log-concavity, cutting plane methods
12    Acceleration
13    Mirror descent and applications
14    TBD

Additional Resources

Contact

For any questions or concerns, please email yangpliu@cs.cmu.edu.