Basic Concepts in convex optimization (1)


1.1 What is a convex set?

In simple terms, a convex set is a set of points with the following property: take any two points x and y in the set, and every point on the line segment between them (including the endpoints) also belongs to the set.

For example, the set on the left of the figure is convex, while the one on the right is not, because we can find two points such that some points on the segment between them lie outside the set.

Mathematically, the convex set is defined as follows:

Given a set $C$, $\forall x,y\in C$, $0\leq\theta\leq 1$, if

$$ \theta x + (1-\theta) y \in C$$

then we call $C$ a convex set, and we call the point $\theta x + (1-\theta) y$ a convex combination of $x$ and $y$.
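As a quick sanity check of this definition, the sketch below (a minimal illustration; the closed unit disc and the sampling grid are my own choices, not from the source) verifies that convex combinations of two points in the disc stay in the disc:

```python
import math

def convex_combination(x, y, theta):
    """Return theta*x + (1-theta)*y for 2-D points x and y."""
    return (theta * x[0] + (1 - theta) * y[0],
            theta * x[1] + (1 - theta) * y[1])

def in_unit_disc(p):
    """Membership test for the closed unit disc, a standard convex set."""
    return math.hypot(p[0], p[1]) <= 1.0

# Every point sampled on the segment between two members stays inside:
x, y = (0.8, 0.0), (0.0, -0.9)
assert all(in_unit_disc(convex_combination(x, y, t / 10)) for t in range(11))
```

A non-convex set (say, the disc with a hole cut out of its middle) would fail this test for a suitable choice of x and y.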

1.2 What is a convex function?

Suppose we have a function $f:\mathbb{R}^n\to \mathbb{R}$ with domain $\mathcal{D}(f)$. If $\mathcal{D}(f)$ is a convex set, and any two points $x,y \in \mathcal{D}(f)$ satisfy

$$ f(\theta x + (1-\theta) y) \leq \theta f(x) + (1-\theta) f(y) $$

for all $0\leq\theta\leq 1$, then we call $f$ a convex function.

Note: the requirement that the domain be a convex set is there simply to ensure that the convex combination of $x$ and $y$ stays in the domain, so that $f$ is defined on it.

Intuitively, a picture helps deepen the understanding: take two points $x, y$ in the domain and connect them with a line segment. If every point on that segment lies above the corresponding function value (that is, the chord lies above the graph), we say the function is convex.

Further, if the inequality above is strict whenever $x\neq y$ and $0<\theta <1$, we call $f$ strictly convex. If $-f$ is a convex function, then $f$ is a concave function; if $-f$ is strictly convex, then $f$ is strictly concave.
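The defining inequality is easy to probe numerically. The sketch below checks it for $f(x)=x^2$, a standard convex example chosen here for illustration:

```python
def f(x):
    """f(x) = x**2, convex on all of R."""
    return x * x

# Check f(theta*x + (1-theta)*y) <= theta*f(x) + (1-theta)*f(y)
# on a small grid of points and mixing weights:
for x in (-1.5, 0.0, 2.0):
    for y in (-3.0, 0.5, 4.0):
        for t in (0.0, 0.3, 0.7, 1.0):
            lhs = f(t * x + (1 - t) * y)
            rhs = t * f(x) + (1 - t) * f(y)
            assert lhs <= rhs + 1e-12
```

A grid check like this cannot prove convexity, but a single violation would disprove it.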

1.3 Equivalent conditions for convexity

We have defined what a convex function is, but in practice this definition is hard to apply directly to determine whether a function is convex, so we introduce a few equivalent conditions.

1.3.1 First-order approximation

Suppose the function $f:\mathbb{R}^n\to \mathbb{R}$ is differentiable (that is, the gradient $\nabla_x f(x)$ exists on the entire domain). Then $f$ is a convex function if and only if its domain is a convex set and, for all $x,y\in\mathcal{D}(f)$,

$$ f(y) \geq f(x) +\nabla_x f(x)^T (y-x) $$

We call $f(x)+\nabla_x f(x)^T(y-x)$ the first-order approximation of $f$ at $x$. Geometrically it is the tangent plane through the point $x$, and we use points on this tangent plane to approximate $f(y)$. The inequality says that if $f$ is convex, its first-order approximation always lies on or below the function.
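This tangent-line property can also be verified numerically. A minimal one-dimensional sketch, again using $f(x)=x^2$ (my choice of example, not from the source):

```python
def f(x):
    return x * x          # convex

def grad_f(x):
    return 2 * x          # derivative of x**2

def first_order_approx(x, y):
    """Tangent line of f at x, evaluated at y: f(x) + f'(x)*(y - x)."""
    return f(x) + grad_f(x) * (y - x)

# The tangent line never rises above a convex function:
for x in (-2.0, 0.0, 1.3):
    for y in (-3.0, -0.5, 2.7):
        assert f(y) >= first_order_approx(x, y) - 1e-12
```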

1.3.2 Second-order approximation

Suppose the function $f:\mathbb{R}^n\to \mathbb{R}$ is twice differentiable (that is, its Hessian matrix is defined on the entire domain). Then $f$ is a convex function if and only if its domain is a convex set and its Hessian matrix is positive semidefinite, that is:

$$ \nabla^2_x f (x) \succeq 0$$

Some readers may have forgotten what the Hessian matrix looks like, so here is a reminder. Suppose our variable lives in n-dimensional space, $x\in\mathbb{R}^n$, and write $x=(x_1,x_2,\dots,x_n)=\{x_i\}_{i=1}^n$, a vector of n variables. Then the Hessian matrix (denoted $H$) is an $n\times n$ matrix with entries

$$ H_{ij}=\frac{\partial^2 f(x)}{\partial x_i\partial x_j} $$

In other words, $H_{ij}$ is the second partial derivative of $f(x)$, taken with respect to $x_i$ and $x_j$.
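To make this concrete, the sketch below approximates the Hessian of a convex quadratic by central finite differences and checks positive semidefiniteness through the leading principal minors (a test valid for symmetric 2x2 matrices; the function and step size are illustrative choices):

```python
def f(x1, x2):
    # A convex quadratic; its exact Hessian is [[2, 1], [1, 2]].
    return x1 * x1 + x1 * x2 + x2 * x2

def hessian_2d(f, x1, x2, h=1e-4):
    """2x2 Hessian of f at (x1, x2) via central finite differences."""
    d11 = (f(x1 + h, x2) - 2 * f(x1, x2) + f(x1 - h, x2)) / h**2
    d22 = (f(x1, x2 + h) - 2 * f(x1, x2) + f(x1, x2 - h)) / h**2
    d12 = (f(x1 + h, x2 + h) - f(x1 + h, x2 - h)
           - f(x1 - h, x2 + h) + f(x1 - h, x2 - h)) / (4 * h**2)
    return [[d11, d12], [d12, d22]]

H = hessian_2d(f, 0.3, -0.7)
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
# For a symmetric 2x2 matrix, PSD <=> H[0][0] >= 0 and det(H) >= 0:
assert H[0][0] >= 0 and det >= -1e-6
```

For larger matrices one would check PSD via eigenvalues or a Cholesky attempt rather than minors.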

1.4 Convex optimization problems

We have now covered convex sets and convex functions. Is it finally time for convex optimization? Not quite: before introducing the concept, two more ideas are needed.

1.4.1 Sublevel sets

Starting from the concept of a convex function, we can derive the concept of a sublevel set. Suppose $f(x)$ is a convex function. Given a real number $\alpha\in\mathbb{R}$, we call the set

$$ \{x\in\mathcal{D}(f) \mid f(x) \leq \alpha\}$$

the $\alpha$-sublevel set. That is, the $\alpha$-sublevel set is the set of all points satisfying $f(x) \leq \alpha$. Using the convexity of $f$, we can prove that a sublevel set is also a convex set: for $x, y$ in the set and $0\leq\theta\leq 1$,

$$ f(\theta x+ (1-\theta) y) \leq \theta f(x) + (1-\theta) f(y) \leq \theta \alpha + (1-\theta) \alpha=\alpha $$

The sublevel set tells us that imposing an upper bound on a convex function leaves a subset of the domain that is still a convex set.
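A small numeric sketch of this fact, taking $f(x)=x^2$ and $\alpha=4$ so that the sublevel set is the interval $[-2, 2]$ (these specific values are my own illustration):

```python
def f(x):
    return x * x

alpha = 4.0

def in_sublevel(x):
    """Membership in the alpha-sublevel set {x : f(x) <= alpha}."""
    return f(x) <= alpha

# Convex combinations of two members of the sublevel set stay inside:
x, y = -2.0, 1.5
assert in_sublevel(x) and in_sublevel(y)
assert all(in_sublevel(t / 10 * x + (1 - t / 10) * y) for t in range(11))
```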

1.4.2 Affine functions

Mathematically, a function of the form

$$ h(x) = Ax+b$$

is called an affine function, where $A\in\mathbb{R}^{m\times n}$ is a matrix and $b\in\mathbb{R}^m$ is a vector. Intuitively, an affine function maps a vector from n-dimensional space into m-dimensional space via the linear transformation $A$, and then translates it by adding the vector $b$.

Similarly, we can prove that the point set

$$ \{x\in\mathcal{D}(h) \mid h(x) = 0\}$$

is a convex set; the proof is omitted here.
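For concreteness, an affine map can be sketched in a few lines of plain Python (the matrix $A$ and vector $b$ below are arbitrary illustrative values):

```python
def affine(x, A, b):
    """h(x) = A x + b, with A given as a list of rows."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(A, b)]

A = [[1.0, 2.0],
     [0.0, 1.0]]     # linear part: maps R^2 -> R^2
b = [1.0, -1.0]      # translation

assert affine([0.0, 0.0], A, b) == b   # h(0) = b
```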

1.4.3 Convex optimization

So back to the convex optimization problem, what is a convex optimization problem?

A convex optimization problem can be defined as:

$$ \min_x \ f(x) \quad \text{s.t.} \quad x\in C $$

where $f$ is a convex function and $C$ is a convex set. Based on the concepts of sublevel sets and affine functions described earlier, the problem can be written equivalently as:

$$ \min_x \ f(x) \quad \text{s.t.} \quad g_i(x) \leq 0,\ i=1,\dots,m, \qquad h_j(x) = 0,\ j=1,\dots,p $$

where each $g_i(x)$ is a convex function and each $h_j(x)$ is an affine function. In other words, the original constraint set $C$ is represented as the intersection of a series of convex sets (it can be shown that the intersection of convex sets is itself a convex set).
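As a tiny end-to-end illustration, the sketch below solves a one-dimensional convex problem, minimize $(x-3)^2$ subject to $x - 1 \leq 0$, with projected gradient descent. Both the problem instance and the solver are my own illustrative choices, not prescribed by the source; the unconstrained minimizer $x=3$ is infeasible, so the constrained optimum sits on the boundary at $x=1$:

```python
def f(x):
    return (x - 3.0) ** 2       # convex objective

def grad(x):
    return 2.0 * (x - 3.0)      # gradient of the objective

def project(x):
    # Euclidean projection onto C = {x : g(x) = x - 1 <= 0}.
    return min(x, 1.0)

x = 0.0                              # feasible starting point
for _ in range(100):
    x = project(x - 0.1 * grad(x))   # gradient step, then project back

assert abs(x - 1.0) < 1e-9           # constrained minimizer is x = 1
```

In practice a modeling tool or a general-purpose solver would be used, but the structure is the same: a convex objective minimized over a convex feasible set.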

1.4.4 Local optima and global optima

Local optimum: no point in some small neighborhood around it has a smaller function value.

Mathematical definition:

If there exists $r>0$ such that for all $z$ with $\left\|x-z\right\|_2<r$ we have $f(x) \leq f(z)$, then $x$ is called a local optimum.

Global optimum: the point with the smallest function value over the entire domain.

Mathematical definition:

If $f(x) \leq f(z)$ for all $z$ in the domain, then $x$ is the global optimum.

Now, back to the convex optimization problem. There is a very important conclusion for convex optimization problems:

For a convex function, a local optimum is also a global optimum. The proof is as follows:

We prove it by contradiction. Suppose $x$ is a local optimum but not a global optimum, and let the global optimum be $z^*$; then $f(x) > f(z^*)$.

By the local optimality of $x$: there exists $r>0$ such that for all $z$ with $\left\|x-z\right\|_2<r$, we have $f(x) \leq f(z)$.

Consider the convex combination of $x$ and $z^*$: $z =\theta x+ (1-\theta) z^*$. Wherever $z^*$ lies, by taking $\theta$ close enough to $1$ we can always place $z$ in the neighborhood of $x$, so that $f(x) \leq f(z)$.

On the other hand, by the convex function nature, we have:

$$ f(z) = f(\theta x + (1-\theta) z^*) \leq \theta f(x) + (1-\theta) f(z^*) < \theta f(x) + (1-\theta) f(x) = f(x) $$

That is, $f(z) < f(x)$, which contradicts $f(x) \leq f(z)$. Hence if $x$ is a local optimum of a convex function, it is also a global optimum.

Reference:

http://cs229.stanford.edu/section/cs229-cvxopt.pdf
