Simulated annealing method

Summary
This article introduces the basic idea of simulated annealing and the choice of its main parameters, then works through a concrete problem, finding the minimum of a two-dimensional function, and gives the C# source code.

Overview

In management science, computer science, molecular physics and biology, as well as very-large-scale integrated circuit design, code design, image processing, electronic engineering and other scientific and technical fields, there are many combinatorial optimization problems. For many of these, such as the traveling salesman problem, graph coloring, and device placement and wiring, no effective polynomial-time algorithm has been found; these problems have been proven to be NP-complete.

In 1982, Kirkpatrick introduced the annealing idea into the field of combinatorial optimization and proposed an algorithm for solving large-scale combinatorial optimization problems, one that is particularly effective for NP-complete problems. The algorithm is modeled on the annealing of solids: the temperature is first raised to a high value and the material is then cooled slowly (annealed), so that it reaches its lowest-energy state. If it is cooled rapidly (quenched), it cannot reach the lowest-energy state.

The interpretation in [1]: "Simulated annealing is a technique which can be applied to any minimisation or learning process based on successive update steps (either random or deterministic) where the update step length is proportional to an arbitrarily set parameter which can play the role of a temperature. Then, in analogy with the annealing of metals, the temperature is made high in the early stages of the process for faster minimisation or learning, then is reduced slowly for greater stability."

That is, simulated annealing is a technique applicable to any minimization or learning process based on successive update steps, whether random or deterministic. The length of each update step is proportional to a parameter that plays the role of temperature. Then, in analogy with the annealing of metals, the temperature is made high in the early stages so that minimization or learning proceeds faster, and is later reduced slowly for greater stability.

Main idea of the simulated annealing algorithm

To find the minimum of a function, the main idea of simulated annealing is: perform a random walk (randomly choosing points) over the search region (here a two-dimensional plane) and, using the Metropolis sampling criterion, let the random walk gradually converge to the optimal solution. Temperature is the key control parameter of the Metropolis criterion; its value can be thought of as controlling how quickly the process moves toward a local or global optimum.

The cooling schedule, the neighborhood structure with its new-solution generator, and the acceptance criterion together with the random number generator (i.e., the Metropolis algorithm) constitute the three pillars of the algorithm.

Importance sampling and the Metropolis algorithm

Metropolis sampling is an effective importance sampling method. The algorithm is as follows: when the system changes from one energy state to another, the energy changes from E1 to E2, and the transition is accepted with probability P = exp[-(E2 - E1)/kT]. If E2 < E1, the system always accepts the new state; otherwise it accepts it with probability P (compared against a uniform random number) or keeps the old state. After a sufficient number of such iterations, the system tends toward a stable distribution.
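As a sketch, the acceptance rule just described can be written in C# as follows. The class and method names here (Metropolis, Accept) are illustrative only, and the Boltzmann constant k is absorbed into the temperature:

```csharp
using System;

// Illustrative sketch of the Metropolis acceptance criterion.
// Names are this sketch's own; they do not appear in the article's program.
static class Metropolis
{
    static readonly Random Rnd = new Random();

    // Returns true if the move from energy eCurrent to eNew is accepted
    // at the given temperature, per P = exp(-(E2 - E1)/T).
    public static bool Accept(double eCurrent, double eNew, double temperature)
    {
        if (eNew < eCurrent)
            return true;                    // downhill: always accept
        double p = Math.Exp(-(eNew - eCurrent) / temperature);
        return p > Rnd.NextDouble();        // uphill: accept with probability p
    }

    static void Main()
    {
        // At high temperature uphill moves are often accepted;
        // at near-zero temperature they almost never are.
        Console.WriteLine(Accept(1.0, 2.0, 100.0));
        Console.WriteLine(Accept(1.0, 2.0, 1e-12));
    }
}
```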

With importance sampling, a downhill move (toward a local optimum) is always accepted, while an uphill move (global exploration) is accepted with a certain probability. Simulated annealing starts from an initial solution; after a large number of transitions at a given value of the control parameter T, a relatively optimal solution of the combinatorial optimization problem is obtained. The value of T is then reduced and the Metropolis algorithm executed again; as T tends to zero, the globally optimal solution is approached. The control parameter must decrease slowly.
Temperature is the key control parameter of the Metropolis criterion, and simulated annealing can be regarded as iteration of the Metropolis algorithm under a decreasing control parameter. At the start, T is large and even considerably worse solutions may be accepted; as T decreases, only slightly worse solutions are accepted; finally, as T tends to 0, no worse solutions are accepted at all.

When the temperature is infinitely high, the system immediately reaches a uniform distribution and accepts every proposed transition. The more slowly T decays, the longer it takes T to reach its final value, but the shorter the Markov chain needed to reach a quasi-equilibrium distribution at each temperature.

Parameter selection

The set of parameters that governs the simulated annealing process is called the cooling schedule. It controls the initial value of the parameter T and its decay function, as well as the corresponding Markov chain length and the stopping condition, and it is very important.

A cooling schedule should specify the following parameters:

1. The initial value T0 of the control parameter T;

2. The decay function of the control parameter T;

3. The length Lk of the Markov chain (that is, the number of iterations of each random walk, after which the process tends to a quasi-equilibrium distribution, i.e., a locally convergent solution);

4. The termination condition.

Criteria for an effective cooling schedule:
1. Convergence of the algorithm: depends mainly on the decay function, the length of the Markov chain, and the choice of stopping criterion.
2. Experimental performance of the algorithm: quality of the final solution and CPU time.
Parameter selection:
1) Choosing the initial value T0 of the control parameter
Generally, T0 must be sufficiently large, i.e., the initial state is at high temperature, so that the Metropolis acceptance rate is close to 1.
2) Choosing the decay function
The decay function controls how fast the temperature is annealed. A common choice is T(n+1) = k * T(n), where k is a constant very close to 1.
3) Choosing the length L of the Markov chain

The principle is: given the chosen decay function for the control parameter T, the chain should be long enough that quasi-equilibrium can be restored at each value of T.

4) Choosing the termination condition

There are many possible termination conditions, and the choice greatly affects both the performance of the algorithm and the quality of the result. This article uses one common condition: when the difference between the previous optimal solution and the latest optimal solution is less than a given tolerance, iteration of the Markov chain can stop.
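Taken together, items 1 through 4 amount to a simple outer loop. The following sketch uses illustrative values only (T0 = 100, k = 0.95, and a cutoff temperature standing in for the tolerance test above) to show how a geometric cooling schedule winds the temperature down:

```csharp
using System;

// Sketch of a geometric cooling schedule T(n+1) = k * T(n).
// The values t0, k, and tMin are illustrative, not prescribed;
// a real run would stop on the tolerance test described above.
static class Cooling
{
    // Count how many cooling steps it takes for the temperature
    // to fall from t0 below tMin when decaying by factor k each step.
    public static int StepsToCool(double t0, double k, double tMin)
    {
        double t = t0;
        int steps = 0;
        while (t > tMin)
        {
            t *= k;    // decay function: multiply by a constant close to 1
            steps++;
        }
        return steps;
    }

    static void Main()
    {
        // How many cooling steps until T falls below 0.001, starting from 100?
        Console.WriteLine(StepsToCool(100.0, 0.95, 1e-3));
    }
}
```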

Example: find the minimum of the function f(x, y) = 5sin(xy) + x^2 + y^2 using simulated annealing.

Solution: based on the problem, the cooling schedule is set as follows:

The initial temperature is 100.

The decay parameter is 0.95.

The length of the Markov chain is 10000.

The step-size factor of the Metropolis walk is 0.02.

The ending condition is that the difference between the previous optimal solution and the latest optimal solution is less than a certain tolerance.

Simulation uses the Metropolis acceptance criterion. The program is as follows:
/*
 * Simulated annealing is used to find the minimum of the function
 * f(x, y) = 5sin(xy) + x^2 + y^2.
 * The termination condition is that the difference between two successive
 * optimal solutions is smaller than a small tolerance.
 */
using System;

namespace SimulateAnnealing
{
    class Class1
    {
        // Objective function whose minimum is sought
        static double ObjectFunction(double x, double y)
        {
            return 5.0 * Math.Sin(x * y) + x * x + y * y;
        }

        [STAThread]
        static void Main(string[] args)
        {
            // Bounds of the search interval
            const double XMax = 4;
            const double YMax = 4;

            // Cooling schedule parameters
            int markovLength = 10000;     // length of the Markov chain
            double decayScale = 0.95;     // decay parameter
            double stepFactor = 0.02;     // step-size factor
            double temperature = 100;     // initial temperature
            double tolerance = 1e-8;      // tolerance

            double preX, nextX;           // previous and candidate values of x
            double preY, nextY;           // previous and candidate values of y
            double preBestX, preBestY;    // previous optimal solution
            double bestX, bestY;          // final solution
            double acceptPoints = 0.0;    // number of points accepted in the Metropolis process

            Random rnd = new Random();

            // Choose a random initial point
            preX = -XMax * rnd.NextDouble();
            preY = -YMax * rnd.NextDouble();
            preBestX = bestX = preX;
            preBestY = bestY = preY;

            // Cool once per iteration of the outer loop until the termination condition is met
            do
            {
                temperature *= decayScale;
                acceptPoints = 0.0;

                // Iterate markovLength times (the Markov chain length) at the current temperature
                for (int i = 0; i < markovLength; i++)
                {
                    // 1) Choose a random point in the neighborhood of the current point
                    do
                    {
                        nextX = preX + stepFactor * XMax * (rnd.NextDouble() - 0.5);
                        nextY = preY + stepFactor * YMax * (rnd.NextDouble() - 0.5);
                    }
                    while (!(nextX >= -XMax && nextX <= XMax && nextY >= -YMax && nextY <= YMax));

                    // 2) Is this a new global optimum?
                    if (ObjectFunction(bestX, bestY) > ObjectFunction(nextX, nextY))
                    {
                        // Keep the previous optimal solution
                        preBestX = bestX;
                        preBestY = bestY;
                        // This is the new optimal solution
                        bestX = nextX;
                        bestY = nextY;
                    }

                    // 3) Metropolis process
                    if (ObjectFunction(preX, preY) - ObjectFunction(nextX, nextY) > 0)
                    {
                        // Accept; the next iteration starts from the newly accepted point
                        preX = nextX;
                        preY = nextY;
                        acceptPoints++;
                    }
                    else
                    {
                        double change = -1 * (ObjectFunction(nextX, nextY) - ObjectFunction(preX, preY)) / temperature;
                        if (Math.Exp(change) > rnd.NextDouble())
                        {
                            preX = nextX;
                            preY = nextY;
                            acceptPoints++;
                        }
                        // Otherwise do not accept; keep the original solution
                    }
                }

                Console.WriteLine("{0}, {1}, {2}, {3}", preX, preY, ObjectFunction(preX, preY), temperature);
            } while (Math.Abs(ObjectFunction(bestX, bestY) - ObjectFunction(preBestX, preBestY)) > tolerance);

            Console.WriteLine("minimum value at: {0}, {1}", bestX, bestY);
            Console.WriteLine("minimum value: {0}", ObjectFunction(bestX, bestY));
        }
    }
}

Result:

Minimum value at: -1.07678129318956, 1.07669421564618

Minimum value: -2.26401670947686
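The printed minimum can be sanity-checked by plugging the reported point back into the objective function; this check is an addition for illustration, not part of the original program:

```csharp
using System;

// Evaluate f(x, y) = 5sin(xy) + x^2 + y^2 at the point the program reported
// and confirm it reproduces the reported minimum value.
static class ResultCheck
{
    public static double F(double x, double y)
    {
        return 5.0 * Math.Sin(x * y) + x * x + y * y;
    }

    static void Main()
    {
        Console.WriteLine(F(-1.07678129318956, 1.07669421564618)); // approximately -2.264
    }
}
```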

Postscript:

I originally intended to write a series of articles on solving numerical problems with C#, because there are few articles on numerical computing on CSDN, and I hoped to fill in some gaps.

When I first searched the Internet for material on simulated annealing, intending to use it as an example of numerical computing in C#, I could not find any ready-made source code. After much experimentation I finally wrote this program, and rather than keep it to myself I offer it as an example of simulated annealing and of solving numerical problems in C#.

This article tries to avoid being overly academic, for example in its use of mathematical and physical terminology and formulas, and so may be unclear in places; I ask for the reader's understanding. Questions and criticism are welcome by email: armylau2@163.com

In addition, simulated annealing can be applied to more complex problems, such as the traveling salesman problem and other combinatorial optimization problems. This example only finds the minimum of a two-dimensional function, and the cooling schedule parameters are chosen very simply, so it can serve only as a first introduction.
