Optimization
Introduction
Convex optimization is a subfield of mathematical optimization that deals with the minimization of convex functions over convex sets. Convex optimization problems have the property that every local minimum is a global minimum, which is highly desirable for optimization algorithms. The field has many applications in areas such as machine learning, signal processing, statistics, finance, and operations research.
Usually the problem is to find an $x$ that minimizes a function subject to constraints:

$$
\begin{aligned}
\text{minimize} \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \le 0, \quad i = 1, \dots, m \\
& h_i(x) = 0, \quad i = 1, \dots, p
\end{aligned}
$$

We call $x \in \mathbb{R}^n$ the optimization variable, $f_0$ the objective function, $f_i$ the inequality constraint functions, and $h_i$ the equality constraint functions.

The set of points for which all the functions are defined, $\mathcal{D} = \bigcap_{i=0}^{m} \operatorname{dom} f_i \cap \bigcap_{i=1}^{p} \operatorname{dom} h_i$, is called the domain. A point $x \in \mathcal{D}$ is feasible if it satisfies all the constraints, and the optimal value is $p^\star = \inf \{ f_0(x) \mid f_i(x) \le 0,\ i = 1, \dots, m,\ h_i(x) = 0,\ i = 1, \dots, p \}$.

If the problem is infeasible, $p^\star = +\infty$; if there are feasible points with $f_0(x) \to -\infty$, the problem is unbounded below and $p^\star = -\infty$.

If an optimal point $x^\star$ with $f_0(x^\star) = p^\star$ exists, the problem is solvable. A problem that is unbounded below is not solvable. A feasible point $x$ with $f_0(x) \le p^\star + \epsilon$ is called $\epsilon$-suboptimal.

A locally optimal point is a feasible point that solves the minimization problem restricted to a neighborhood of itself. In other words, $x$ minimizes $f_0$ over the feasible points $z$ that satisfy $\| z - x \|_2 \le R$, for some $R > 0$.

The standard form of an optimization problem arranges the functions and inequalities so that the right-hand side is 0. For example, an equality constraint $g_i(x) = \tilde{g}_i(x)$ is rewritten as $h_i(x) = g_i(x) - \tilde{g}_i(x) = 0$.
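To make the standard form concrete, here is a minimal sketch using the cvxpy library; the data $A$, $b$ and the particular constraints are invented for illustration:

```python
# A minimal sketch of a standard-form convex problem, using cvxpy.
# The problem data (A, b) and constraints are invented for illustration.
import cvxpy as cp
import numpy as np

np.random.seed(0)
A = np.random.randn(5, 3)
b = np.random.randn(5)

x = cp.Variable(3)

# minimize    f0(x) = ||Ax - b||_2^2
# subject to  f_i(x) <= 0  (here: x - 1 <= 0 elementwise)
#             h_i(x) = 0   (here: sum(x) = 0)
problem = cp.Problem(
    cp.Minimize(cp.sum_squares(A @ x - b)),
    [x - 1 <= 0, cp.sum(x) == 0],
)
problem.solve()
print("optimal value p* =", problem.value)
print("optimal point x* =", x.value)
```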
Equivalence
Two problems are equivalent if, from the solution of one, the solution of the other is easily found, and vice versa. One way to make an equivalent problem is to scale the objective and constraint functions:

$$
\begin{aligned}
\text{minimize} \quad & \alpha_0 f_0(x) \\
\text{subject to} \quad & \alpha_i f_i(x) \le 0, \quad i = 1, \dots, m \\
& \beta_i h_i(x) = 0, \quad i = 1, \dots, p
\end{aligned}
$$

where $\alpha_i > 0$ for $i = 0, \dots, m$ and $\beta_i \ne 0$. Since the scaling preserves the signs of the constraints, the feasible set of the scaled problem is the same as that of the original one. There are several other transformations that produce equivalent problems, listed below.
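As a quick sanity check (my own toy example, using cvxpy), scaling the objective by $\alpha_0 = 3$ and the constraint by $\alpha_1 = 5$ leaves the minimizer unchanged:

```python
# Sketch: scaling the objective and constraints does not change the
# minimizer. The problem and scale factors are invented for illustration.
import cvxpy as cp

x = cp.Variable()

# Original: minimize (x - 2)^2 subject to x - 1 <= 0
orig = cp.Problem(cp.Minimize((x - 2) ** 2), [x - 1 <= 0])
orig.solve()
x_orig = x.value

# Scaled: minimize 3(x - 2)^2 subject to 5(x - 1) <= 0
scaled = cp.Problem(cp.Minimize(3 * (x - 2) ** 2), [5 * (x - 1) <= 0])
scaled.solve()
x_scaled = x.value

print(x_orig, x_scaled)  # both ~1.0: same optimal point
```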
- Change of variables: we substitute the variable $x = \phi(z)$, where $\phi$ is one-to-one and its image covers the problem domain, and define $\tilde{f}_i(z) = f_i(\phi(z))$ and $\tilde{h}_i(z) = h_i(\phi(z))$; the problem becomes an equivalent one:

  $$
  \begin{aligned}
  \text{minimize} \quad & \tilde{f}_0(z) \\
  \text{subject to} \quad & \tilde{f}_i(z) \le 0, \quad i = 1, \dots, m \\
  & \tilde{h}_i(z) = 0, \quad i = 1, \dots, p
  \end{aligned}
  $$

  If $x$ solves the original problem, then $z = \phi^{-1}(x)$ solves the new one; if $z$ solves the new problem, then $x = \phi(z)$ solves the original.
- Transformation of objective and constraint functions: let $\psi_0$ be monotone increasing, let $\psi_1, \dots, \psi_m$ satisfy $\psi_i(u) \le 0$ iff $u \le 0$, and let $\psi_{m+1}, \dots, \psi_{m+p}$ satisfy $\psi_i(u) = 0$ iff $u = 0$. We come up with an equivalent problem: minimize $\psi_0(f_0(x))$ subject to $\psi_i(f_i(x)) \le 0$ and $\psi_{m+i}(h_i(x)) = 0$. For example, the unconstrained Euclidean norm minimization problem, minimize $\|Ax - b\|_2$, is equivalent to minimizing $\|Ax - b\|_2^2$, since squaring is monotone increasing on the nonnegative reals.
- Adding slack variables $s_i \ge 0$ so that the inequalities become equalities. We have: minimize $f_0(x)$ subject to $f_i(x) + s_i = 0$ and $s_i \ge 0$ for $i = 1, \dots, m$, and $h_i(x) = 0$ for $i = 1, \dots, p$.
- Eliminating equality constraints: parameterize the solutions of $h_i(x) = 0$ as $x = \phi(z)$, so that the problem becomes: minimize $f_0(\phi(z))$ subject to $f_i(\phi(z)) \le 0$, $i = 1, \dots, m$. For linear equality constraints $Ax = b$, one can take $x = Fz + x_0$, where $x_0$ is any particular solution and the columns of $F$ span the null space of $A$.
- Introducing new equality constraints: when each function has the composite form $f_i(A_i x + b_i)$, set $y_i = A_i x + b_i$ and solve: minimize $f_0(y_0)$ subject to $f_i(y_i) \le 0$ and $y_i = A_i x + b_i$.
- Optimizing sequentially over variables. For example, when the variable splits as $(x, y)$, we can optimize over $y$ first, then $x$: minimize $\tilde{f}_0(x)$, with $\tilde{f}_0(x) = \inf_y f_0(x, y)$.
- By changing into epigraph form: minimize $t$ subject to $f_0(x) - t \le 0$, $f_i(x) \le 0$, and $h_i(x) = 0$, with variables $(x, t)$; see the sketch after this list.
- By making the constraints implicit: minimize a new unconstrained objective $F(x)$ that agrees with $f_0$ on the feasible set but whose domain is restricted to the points satisfying the previous inequality and equality constraints.
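As referenced in the epigraph bullet above, here is a minimal sketch (my own example, using cvxpy) checking that the epigraph form attains the same optimal value as the original problem:

```python
# Sketch: epigraph form of a simple convex problem, using cvxpy.
# The problem data are invented for illustration.
import cvxpy as cp
import numpy as np

np.random.seed(0)
A = np.random.randn(4, 2)
b = np.random.randn(4)

# Original form: minimize ||Ax - b||_2 subject to x >= 0
x = cp.Variable(2)
orig = cp.Problem(cp.Minimize(cp.norm(A @ x - b, 2)), [x >= 0])
orig.solve()

# Epigraph form: minimize t subject to ||Ax - b||_2 - t <= 0, x >= 0
t = cp.Variable()
epi = cp.Problem(cp.Minimize(t), [cp.norm(A @ x - b, 2) - t <= 0, x >= 0])
epi.solve()

print(orig.value, epi.value)  # the two optimal values agree
```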
Convex optimization
From the standard optimization problem, the convex optimization problem requires extra conditions:

- The objective function $f_0$ must be convex
- The inequality constraint functions $f_1, \dots, f_m$ must be convex
- The equality constraint functions must be affine: $h_i(x) = a_i^T x - b_i$
Since the domain of the problem is convex, the feasible set of a convex optimization problem is also convex.
For the convex optimization problem, any locally optimal point is also globally optimal. We prove this by contradiction. If $x$ is feasible and locally optimal, then $f_0(x) = \inf \{ f_0(z) \mid z \text{ feasible},\ \| z - x \|_2 \le R \}$ for some $R > 0$. Suppose $x$ is not globally optimal, so there is a feasible $y$ with $f_0(y) < f_0(x)$; local optimality forces $\| y - x \|_2 > R$. Consider $z = (1 - \theta) x + \theta y$ with $\theta = R / (2 \| y - x \|_2)$. Then $\| z - x \|_2 = R/2 < R$, and $z$ is feasible because the feasible set is convex. By convexity of $f_0$, $f_0(z) \le (1 - \theta) f_0(x) + \theta f_0(y) < f_0(x)$, which contradicts the local optimality of $x$.
Linear optimization
When the objective and constraint functions are all affine, the problem is called a linear program. A general linear program has the form:
$$
\begin{aligned}
\text{minimize} \quad & c^T x + d \\
\text{subject to} \quad & Gx \preceq h \\
& Ax = b
\end{aligned}
$$
Image: geometric interpretation of an LP. The point $x^\star$ is optimal: it is the feasible point farthest in the direction $-c$.
There are two cases of LP that are particularly popular:

- A standard form LP: minimize $c^T x$ subject to $Ax = b$, $x \succeq 0$.
- An inequality form LP: minimize $c^T x$ subject to $Ax \preceq b$.
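For illustration, here is how an inequality form LP might be solved with the cvxpy library; the problem data are invented:

```python
# Sketch of an inequality form LP: minimize c^T x subject to Ax <= b,
# using cvxpy with invented data.
import cvxpy as cp
import numpy as np

c = np.array([1.0, 2.0])
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 4.0])  # encodes x >= 0 and x1 + x2 <= 4

x = cp.Variable(2)
lp = cp.Problem(cp.Minimize(c @ x), [A @ x <= b])
lp.solve()
print(lp.value, x.value)  # optimum at x = (0, 0) with value 0
```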
Here is how to convert a general LP to standard form:

- The first step is to introduce slack variables $s \succeq 0$ for the inequalities: minimize $c^T x + d$, subject to $Gx + s = h$, $Ax = b$, $s \succeq 0$.
- The second step is to express the variable $x$ as the difference of two nonnegative variables $y$ and $z$ ($x = y - z$ with $y, z \succeq 0$): minimize $c^T y - c^T z + d$ subject to $Gy - Gz + s = h$, $Ay - Az = b$, $y \succeq 0$, $z \succeq 0$, $s \succeq 0$.
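As a sanity check on this conversion, here is a small sketch (invented data, using cvxpy) that solves a general form LP and its standard form rewrite and confirms that the optimal values match:

```python
# Sketch: converting a general LP to standard form (equality constraints
# plus nonnegative variables) and checking both give the same optimum.
# The data are invented for illustration.
import cvxpy as cp
import numpy as np

c = np.array([1.0, -2.0])
G = np.array([[1.0, 1.0], [-1.0, 2.0]])
h = np.array([3.0, 4.0])

# General form: minimize c^T x subject to Gx <= h
x = cp.Variable(2)
general = cp.Problem(cp.Minimize(c @ x), [G @ x <= h])
general.solve()

# Standard form: x = y - z with y, z >= 0; slack s >= 0 turns
# Gx <= h into Gy - Gz + s = h, so every variable is nonnegative.
y, z, s = cp.Variable(2), cp.Variable(2), cp.Variable(2)
standard = cp.Problem(
    cp.Minimize(c @ y - c @ z),
    [G @ y - G @ z + s == h, y >= 0, z >= 0, s >= 0],
)
standard.solve()

print(general.value, standard.value)  # the two optimal values agree
```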
Quadratic optimization
A convex optimization problem is called a quadratic program (QP) if the objective function is (convex) quadratic and the constraint functions are affine:

$$
\begin{aligned}
\text{minimize} \quad & \tfrac{1}{2} x^T P x + q^T x + r \\
\text{subject to} \quad & Gx \preceq h \\
& Ax = b
\end{aligned}
$$

where $P \in \mathbb{S}^n_+$, i.e., $P$ is symmetric positive semidefinite.
Image: geometric illustration of a QP. The point $x^\star$ is optimal; the elliptical level curves of the quadratic objective meet the feasible polyhedron there.
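As an illustrative sketch (invented data, using cvxpy), a small QP can be posed as follows:

```python
# Sketch of a small QP: minimize (1/2) x^T P x + q^T x subject to Gx <= h,
# using cvxpy with invented data. P must be positive semidefinite
# for the problem to be convex.
import cvxpy as cp
import numpy as np

P = np.array([[2.0, 0.5], [0.5, 1.0]])  # symmetric positive definite
q = np.array([1.0, 1.0])
G = np.array([[-1.0, 0.0], [0.0, -1.0]])
h = np.array([0.0, 0.0])  # encodes x >= 0

x = cp.Variable(2)
qp = cp.Problem(
    cp.Minimize(0.5 * cp.quad_form(x, P) + q @ x),
    [G @ x <= h],
)
qp.solve()
print(qp.value, x.value)
```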
Geometric programming
Geometric programming is a family of optimization problems that are not naturally convex but can be transformed into a convex problem by a change of variables and a transformation of the objective and constraint functions.
A function $f(x) = c\, x_1^{a_1} x_2^{a_2} \cdots x_n^{a_n}$, defined for $x \succ 0$, with $c > 0$ and $a_i \in \mathbb{R}$, is called a monomial; a sum of monomials is called a posynomial.

Then the optimization problem of the form:

$$
\begin{aligned}
\text{minimize} \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \le 1, \quad i = 1, \dots, m \\
& h_i(x) = 1, \quad i = 1, \dots, p
\end{aligned}
$$

where $f_0, \dots, f_m$ are posynomials and $h_1, \dots, h_p$ are monomials, is called a geometric program (GP).
- To transform the GP into a convex optimization problem, we first change the variables: let $y_i = \log x_i$, so that $x_i = e^{y_i}$. A monomial then becomes $f(x) = e^{a^T y + b}$ with $b = \log c$, and a posynomial becomes $\sum_{k=1}^{K} e^{a_k^T y + b_k}$.
- The second step is to transform the objective and constraint functions by taking logarithms:

$$
\begin{aligned}
\text{minimize} \quad & \log \sum_{k=1}^{K_0} \exp(a_{0k}^T y + b_{0k}) \\
\text{subject to} \quad & \log \sum_{k=1}^{K_i} \exp(a_{ik}^T y + b_{ik}) \le 0, \quad i = 1, \dots, m \\
& g_i^T y + h_i = 0, \quad i = 1, \dots, p
\end{aligned}
$$

Since the log-sum-exp function is convex and the transformed equality constraints are affine in $y$, the result is a convex optimization problem.
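Modern solvers can perform this transformation automatically. As a small sketch (my own toy example, using cvxpy's geometric programming mode via `solve(gp=True)`):

```python
# Sketch of a tiny geometric program solved with cvxpy's GP mode.
# The objective and constraint data are invented for illustration.
import cvxpy as cp

x = cp.Variable(pos=True)  # GP variables must be positive
y = cp.Variable(pos=True)

objective = cp.Minimize(x * y + x / y)  # a posynomial objective
constraints = [
    x * y >= 1,   # equivalent to the posynomial constraint 1/(x*y) <= 1
    x + y <= 10,  # posynomial bounded by a constant
]
problem = cp.Problem(objective, constraints)
problem.solve(gp=True)  # applies the log-log change of variables internally
print(problem.value, x.value, y.value)
```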