Posts by Tags

opttheory

Theory of Optimization: Projected (Sub)Gradient Descent

3 minute read

Published:

In this post, we will continue our analysis of gradient descent. In the previous lecture, we assumed that every function has an $L$-Lipschitz gradient. For general $L$-smooth functions, gradient descent finds a first-order $\epsilon$-critical point in $O(1/\epsilon^2)$ iterations. When the function is convex, we showed that $O(1/\epsilon)$ iterations suffice to obtain a solution whose value differs from the optimum by at most $\epsilon$. When the function is both smooth and strongly convex, we showed that the number of iterations can be reduced to $O(\kappa \log(1/\epsilon))$. However, the previous post assumed that the function is smooth, which implies that it is differentiable everywhere. In this post, we will instead assume that the function is convex but not necessarily smooth. Moreover, the previous post focused on the unconstrained case; here we will also introduce the analysis for constrained minimization. Read more
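Since this excerpt only summarizes the setting, here is a minimal sketch of the projected subgradient step it refers to. The oracle names `subgrad` and `project`, the step-size schedule, and the iterate averaging are illustrative assumptions of mine, not notation taken from the post.

```python
import numpy as np

def projected_subgradient_descent(subgrad, project, x0, step_sizes, T):
    """Minimal sketch of projected subgradient descent.

    subgrad(x)  -- returns a subgradient of the convex objective f at x (assumed oracle)
    project(x)  -- Euclidean projection onto the feasible convex set (assumed oracle)
    x0          -- feasible starting point
    step_sizes  -- iterable of step sizes eta_t (e.g. proportional to 1/sqrt(t))
    T           -- number of iterations
    """
    x = np.array(x0, dtype=float)
    avg = np.zeros_like(x)
    for t, eta in zip(range(1, T + 1), step_sizes):
        g = subgrad(x)
        x = project(x - eta * g)   # subgradient step, then project back onto the set
        avg += (x - avg) / t       # running average of the iterates
    return avg
```

The averaged iterate is returned because, for non-smooth convex problems, the standard guarantee bounds the suboptimality of the average (or the best iterate) rather than the last point.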

Theory of Optimization: Gradient Descent

3 minute read

Published:

In this post, we will review the most basic and most intuitive method in optimization: gradient descent.

Gradient Descent

The gradient descent algorithm works as follows: the algorithm requires an initial point $x_0$ and a step size $\eta$. It then repeatedly executes the update $x_{t+1} = x_t - \eta \nabla f(x_t)$ until $\|\nabla f(x_t)\| \le \epsilon$. In the rest of this section, we will assume that the gradient of $f$ is $L$-Lipschitz, i.e., $\|\nabla f(x) - \nabla f(y)\| \le L \|x - y\|$ for all $x, y$; we say that such an $f$ has $L$-Lipschitz gradient, or is $L$-smooth. Read more
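As a concrete illustration of the loop described above, here is a minimal sketch in Python. The names `grad`, `eta`, and `eps` are my own illustrative choices, not the notation of the post, and the example objective is an assumption added for demonstration.

```python
import numpy as np

def gradient_descent(grad, x0, eta, eps=1e-6, max_iter=10_000):
    """Minimal gradient descent sketch.

    grad(x)   -- gradient oracle for the smooth objective f
    x0        -- initial point
    eta       -- step size (for an L-smooth f, eta = 1/L is the usual choice)
    eps       -- stop once the gradient norm falls below eps
    """
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:   # stopping criterion: gradient is small
            break
        x = x - eta * g                # step against the gradient direction
    return x

# Example: minimize f(x) = ||x||^2 / 2, whose gradient is x (so L = 1, eta = 1).
x_star = gradient_descent(grad=lambda x: x, x0=[3.0, -4.0], eta=1.0)
```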

Theory of Optimization: Preliminaries and Basic Properties

4 minute read

Published:

Recently, I found an interesting course taught by Prof. Yin Tat Lee at UW. The course is called "Theory of Optimization and Continuous Algorithms", and the lecture notes are available on the homepage of this course (uw-cse535-winter19). As a great fan of optimization theory and algorithm design, I plan to follow this course and write a series of blog posts to record my study of it. Most of the material in this series will follow the lecture notes of the course and an interesting optimization book, Convex Optimization: Algorithms and Complexity by Sebastien Bubeck. Since this is the first blog post about this course, I will present the preliminaries of optimization theory and some basic knowledge about convex optimization, including some basic properties of convex functions. Read more