Math 301
Advanced Topics in Convex Optimization
Winter 2015



Instructor
Emmanuel Candes
113 Sequoia Hall

Office Hours: M 1:30-2:30
      or by appointment

   

Lectures
Monday, Wednesday, Friday
11:00-11:50 a.m.
Blume Earthquake Center, room 108

 


Description: The main goal of this course is to expose students to modern and fundamental developments in convex optimization, a subject which has experienced tremendous growth in the last 20 years or so. This course builds on EE 364 and explores two distinct areas. The first concerns cone programming and especially semidefinite programming, whose rich geometric theory and expressive power make it suitable for a wide spectrum of important optimization problems arising in engineering and applied science. The second concerns novel and efficient first-order methods, e.g., Nesterov's method, for smooth and nonsmooth convex optimization, which are suitable for large-scale problems.

This is an advanced topics course, which will hopefully bring students near the frontier of current research. Students are expected to complete a (short) research project to receive credit.


Prerequisite:
EE 364 (Convex optimization), Math 104 (Linear algebra), Stats 217 or equivalent (a first course in stochastic processes), or consent of the instructor.


Syllabus:

  • Conic programming: linear programming (LP), second-order cone programming (SOCP), semidefinite programming (SDP), linear matrix inequalities, conic duality, conic duality theorem (a generic primal-dual pair is sketched after this list).
  • Interior point methods: barrier functions, primal-dual path following methods for LP, SOCP and SDP.
  • Applications of semidefinite programming: control and system theory, polynomial optimization, combinatorial and nonconvex optimization, machine learning.
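
To fix notation, the sketch below records the generic primal-dual pair of conic programs underlying this part of the course; the symbols (A, b, c, the cone K and its dual K*) are standard textbook notation and are only meant as a reminder of the general form, not a definition fixed by the lectures.

    % Generic conic program and its dual (standard-form notation; illustrative only).
    % K is a closed convex cone and K^* its dual cone.
    \begin{align*}
      \text{(P)}\quad & \min_{x}\ \langle c, x\rangle \quad \text{s.t.}\quad Ax = b,\ x \in K,\\
      \text{(D)}\quad & \max_{y}\ \langle b, y\rangle \quad \text{s.t.}\quad c - A^{*}y \in K^{*}.
    \end{align*}
    % Weak duality <c, x> >= <b, y> always holds; the conic duality theorem gives
    % conditions (e.g., strict feasibility) under which strong duality holds.
    % LP, SOCP and SDP correspond to K being the nonnegative orthant, the
    % second-order cone, and the cone of positive semidefinite matrices.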

  • The proximal operator and its properties.
  • Proximal gradient methods for smooth and non-smooth convex optimization, complexity analysis (a minimal sketch of the iteration appears after this list).
  • Optimal first-order methods (Nesterov's method and its variants), complexity analysis.
  • Nonsmooth convex optimization: conjugate functions, Moreau's regularization, smooth approximations of nonsmooth functions by conjugation.
  • Augmented Lagrangian methods and the alternating direction method of multipliers (ADMM), also sketched after this list.
  • Proximal minimization and mirror-descent algorithms (MDA).
  • Example problems in statistics, signal and image processing, control theory...
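
As a concrete illustration of the proximal gradient and accelerated (Nesterov-type) iterations listed above, here is a minimal sketch for the lasso problem min_x (1/2)||Ax - b||^2 + lam*||x||_1; the problem instance, the fixed 1/L step size, and the function names are illustrative choices, not the canonical setup used in lectures.

    import numpy as np

    def soft_threshold(v, t):
        """Proximal operator of t*||.||_1 (soft-thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def fista(A, b, lam, n_iters=500):
        """Accelerated proximal gradient (FISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
        L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part's gradient
        x = np.zeros(A.shape[1])
        z, t = x.copy(), 1.0
        for _ in range(n_iters):
            x_new = soft_threshold(z - A.T @ (A @ z - b) / L, lam / L)  # proximal gradient step
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0             # momentum parameter update
            z = x_new + ((t - 1.0) / t_new) * (x_new - x)               # Nesterov extrapolation
            x, t = x_new, t_new
        return x

    # Illustrative usage on random data.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    b = A @ np.concatenate([np.ones(5), np.zeros(95)])
    x_hat = fista(A, b, lam=0.1)

Keeping t fixed at 1 recovers the unaccelerated proximal gradient method (ISTA), whose objective error decays like O(1/k), versus O(1/k^2) for the accelerated variant.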
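
In the same spirit, the sketch below applies ADMM to the same lasso problem by splitting it as f(x) + g(z) subject to x = z; the penalty parameter rho and the fixed iteration count are arbitrary illustrative choices, and soft_threshold is the helper defined above.

    def admm_lasso(A, b, lam, rho=1.0, n_iters=200):
        """ADMM sketch for min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z."""
        n = A.shape[1]
        x = z = u = np.zeros(n)
        # The x-update solves a ridge-type linear system; cache its Cholesky factor.
        Lfac = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
        Atb = A.T @ b
        for _ in range(n_iters):
            rhs = Atb + rho * (z - u)
            x = np.linalg.solve(Lfac.T, np.linalg.solve(Lfac, rhs))  # x-update
            z = soft_threshold(x + u, lam / rho)                     # z-update (prox of the l1 term)
            u = u + x - z                                            # scaled dual (multiplier) update
        return z

The augmented Lagrangian is minimized alternately over x and z, followed by an ascent step on the scaled multiplier u; this alternation is what makes the method attractive when each block admits a cheap update.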


Textbooks:


We will not require a textbook. The references below contain many of the topics we shall cover. In particular, the book by Ben-Tal and Nemirovski is an excellent companion for the first part of the course.

  1. Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications by A. Ben-Tal and A. Nemirovski, MPS-SIAM Series on Optimization.
  2. Introductory Lectures on Convex Optimization: A Basic Course by Y. Nesterov, Kluwer Academic Publisher.
  3. Convex Optimization by S. Boyd and L. Vandenberghe, Cambridge University Press.
  4. Nonlinear Programming by D. Bertsekas, Athena Scientific.
  5. Convex Analysis and Optimization by D. Bertsekas, A. Nedic, and A. Ozdaglar, Athena Scientific.
  6. Optimization Models by G. Calafiore and L. El Ghaoui, Cambridge Univ. Press.

Handouts: All handouts will be posted online.

Teaching assistants and office hours:

  • Ernest Ryu. Office hours: T 1-2, 380-U Building 380.
  • Weijie Su. Office hours: M 1-2 and W 1-2, 238 Sequoia Hall.

Grading: We may have a few homework projects (to be determined later). The grade will be determined by these problem sets (if any) and a final project, which students will have the freedom to select from a list according to their own interests. Students may also define their own project after consulting with the instructor.