EE364B - Convex Optimization II

EE364B: Convex Optimization II (Stanford Univ.). Taught by Professor Stephen Boyd, this course concentrates on recognizing and solving convex optimization problems that arise in engineering, and is a continuation of Convex Optimization I. Topics include subgradient, cutting-plane, and ellipsoid methods; decentralized convex optimization via primal and dual decomposition; alternating projections; exploiting problem structure in implementation; convex relaxations of hard problems and global optimization via branch and bound; robust optimization; and selected applications in areas such as control, circuit design, signal processing, and communications. Course requirements include a substantial project. (from see.stanford.edu)

Lecture 12 - Recap: Difference Of Convex Programming, Conjugate Gradient Method
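One topic of this lecture is the conjugate gradient method for solving A x = b with A symmetric positive definite. The sketch below is a minimal illustration of that classical algorithm, not code from the course; the names (cg, A, b, tol, max_iter) are illustrative assumptions.

    import numpy as np

    def cg(A, b, tol=1e-8, max_iter=None):
        """Minimal conjugate gradient sketch for A x = b, A symmetric positive definite."""
        n = b.shape[0]
        if max_iter is None:
            max_iter = n
        x = np.zeros(n)          # initial iterate
        r = b - A @ x            # residual
        p = r.copy()             # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)        # exact line search along p
            x = x + alpha * p
            r = r - alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:        # stop when residual is small
                break
            p = r + (rs_new / rs_old) * p    # new A-conjugate search direction
            rs_old = rs_new
        return x

    # Example usage on a small random SPD system.
    M = np.random.randn(50, 50)
    A = M.T @ M + 50 * np.eye(50)
    b = np.random.randn(50)
    x = cg(A, b)
    print(np.linalg.norm(A @ x - b))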


Go to the Course Home or watch other lectures:

Lecture 01 - Introduction, Subgradients
Lecture 02 - Subgradients (cont.)
Lecture 03 - Convergence Proof, Subgradient Methods, Linear Equality Constraints
Lecture 04 - Subgradient Method for Constrained Optimization, Convergence
Lecture 05 - Stochastic Programming, Localization and Cutting-Plane Methods
Lecture 06 - Analytic Center Cutting-Plane Method, Infeasible Start Newton Method Algorithm
Lecture 07 - ACCPM With Constraint Dropping, Ellipsoid Method
Lecture 08 - Recap: Ellipsoid Method, Primal Decomposition, Dual Decomposition
Lecture 09 - Recap: Primal Decomposition, Dual Decomposition
Lecture 10 - Decomposition Applications
Lecture 11 - Sequential Convex Programming
Lecture 12 - Recap: 'Difference Of Convex' Programming, Conjugate Gradient Method, Krylov Subspace
Lecture 13 - Recap: Conjugate Gradient Method and Krylov Subspace, Truncated Newton Method
Lecture 14 - Truncated Newton Method, L1-Norm Methods
Lecture 15 - L1-Norm Methods
Lecture 16 - Model Predictive Control
Lecture 17 - Stochastic Model Predictive Control, Branch and Bound Methods
Lecture 18 - Branch and Bound Methods