Simulated Annealing
In 1982, Kirkpatrick introduced the annealing idea into the field of combinatorial optimization and proposed an algorithm for solving large-scale combinatorial optimization problems, which is particularly effective for NP-complete combinatorial optimization problems. The idea comes from the annealing of solids: the material is first heated to a high temperature and then cooled slowly (annealed) so that it reaches its lowest-energy state; if it is cooled rapidly (quenched), it cannot reach that lowest point.
The simulated annealing algorithm is an update process (stochastic or deterministic) that can be applied to minimization or learning problems. In this process, the size of each update step is controlled by a parameter that plays the role of temperature. Then, as in metal annealing, the temperature is first raised to a high level so that the process can minimize or learn quickly, and is afterwards lowered slowly to achieve stability.
The simulated annealing algorithm is a random search algorithm for solving large-scale optimization problems. It is based on the similarity between the problem-solving process and the physical annealing of a solid: the objective function to be optimized corresponds to the internal energy of the metal, and the state space formed by the combinations of the problem's independent variables corresponds to the metal's space of internal-energy states. The problem is solved by finding a combined state that minimizes the objective function value. By using the Metropolis criterion and appropriately controlling the temperature-descent process (the simulated annealing), the method aims at solving the global optimization problem in polynomial time.
The simulated annealing algorithm is derived from the principle of solid annealing: the solid is heated to a sufficiently high temperature and then cooled slowly. When heated, the particles inside the solid become disordered as the temperature rises and the internal energy increases; when cooled slowly, the particles gradually become ordered, reaching an equilibrium state at each temperature and finally the ground state at room temperature, where the internal energy is reduced to a minimum. According to the Metropolis criterion, the probability that the particles reach equilibrium at temperature T is exp(-ΔE/(kT)), where E is the internal energy at temperature T, ΔE is its change, and k is the Boltzmann constant. Mapping solid annealing onto a combinatorial optimization problem, the internal energy E is modeled as the objective function value f and the temperature T becomes the control parameter t, which yields the simulated annealing algorithm for combinatorial optimization: starting from an initial solution i and an initial value t of the control parameter, repeat the iteration "generate a new solution → compute the objective function difference → accept or discard" on the current solution while gradually decreasing the value of t; the current solution when the algorithm terminates is the approximate optimal solution obtained. This is a heuristic random search process based on Monte Carlo iteration. The annealing process is controlled by the cooling schedule, which comprises the initial value t of the control parameter and its attenuation factor Δt, the number of iterations L at each value of t, and the stop condition S.
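To illustrate the Metropolis criterion in code, here is a minimal C++ sketch (the function name and the demonstration values are illustrative assumptions, not from the original text); it shows that the same uphill move is accepted often at high temperature and almost never at low temperature:

#include <cmath>
#include <cstdio>
#include <cstdlib>

// Metropolis criterion: always accept an improving move; accept a
// worsening move (delta > 0) with probability exp(-delta / T).
bool metropolis_accept(double delta, double T) {
    if (delta < 0.0) return true;
    double u = (double)rand() / RAND_MAX;   // uniform random number in [0, 1]
    return u < exp(-delta / T);
}

int main() {
    srand(0);                               // fixed seed for reproducibility
    const double temps[] = {10.0, 0.1};     // a "hot" and a "cold" temperature
    for (double T : temps) {
        int accepted = 0;
        for (int i = 0; i < 100000; i++)
            if (metropolis_accept(1.0, T))  // the same uphill move, delta = 1
                accepted++;
        printf("T = %5.2f: acceptance rate %.3f\n", T, accepted / 100000.0);
    }
    return 0;
}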
Model of Simulated Annealing Algorithm
The simulated annealing algorithm model can be divided into three parts: the solution space, the objective function, and the initial solution.
The basic idea of simulated annealing:
(1) Initialization: set an initial temperature T (sufficiently large), an initial solution state s (the starting point of the algorithm's iteration), and the number of iterations L at each value of T.
(2) For k = 1, ..., L, perform steps (3) through (6).
(3) Generate a new solution s'.
(4) Compute the increment Δt' = C(s') - C(s), where C(s) is the evaluation function.
(5) If Δt' < 0, accept s' as the new current solution; otherwise accept s' as the new current solution with probability exp(-Δt'/T).
(6) If the termination condition is met, output the current solution as the optimal solution and end the program.
The termination condition is usually that several consecutive new solutions in a row have not been accepted.
(7) Decrease T gradually, with T → 0, and return to step (2).
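To make steps (1) through (7) concrete, here is a minimal C++ sketch applying them to a toy one-dimensional objective; the objective f(x) = x·x, the step size, and all schedule constants are illustrative assumptions rather than values from the text:

#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <ctime>

double f(double x) { return x * x; }          // toy objective, assumed for illustration

int main() {
    srand(time(0));
    double T = 100.0;                          // (1) initial temperature, sufficiently large
    double s = 50.0;                           // (1) initial solution state
    const int L = 100;                         // (1) iterations at each temperature
    while (T > 1e-8) {                         // (7) loop while T has not approached 0
        for (int k = 0; k < L; k++) {          // (2) for k = 1, ..., L
            double step = (double)rand() / RAND_MAX - 0.5;  // uniform in [-0.5, 0.5]
            double s1 = s + step;              // (3) generate a new solution s'
            double dt = f(s1) - f(s);          // (4) increment Δt' = C(s') - C(s)
            double u = (double)rand() / RAND_MAX;
            if (dt < 0 || u < exp(-dt / T))    // (5) Metropolis acceptance
                s = s1;
        }
        T *= 0.95;                             // (7) decrease the temperature
    }
    // (6) output; for brevity the best-so-far solution is not tracked separately.
    printf("approximate minimizer: %f, f = %f\n", s, f(s));
    return 0;
}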
The generation and acceptance of a new solution in the simulated annealing algorithm can be divided into the following four steps:
In the first step, a generation function produces a new solution in the solution space from the current solution. To facilitate subsequent computation and acceptance and to reduce the algorithm's time consumption, the new solution is usually generated by a simple transformation of the current solution, such as replacing or swapping all or some of its elements. Note that the transformation chosen determines the neighborhood structure of the current solution and therefore affects the selection of the cooling schedule.
The second step is to compute the objective function difference corresponding to the new solution. Because the difference is produced only by the transformed part, it is best computed incrementally. Experience shows that for most applications this is the fastest way to compute the objective function difference.
The third step is to decide whether the new solution is accepted, based on an acceptance criterion. The most common criterion is the Metropolis criterion: if Δt' < 0, accept s' as the new current solution s; otherwise accept s' as the new current solution s with probability exp(-Δt'/T).
In the fourth step, when the new solution is accepted, it replaces the current solution: only the transformed part needs to be carried out and the objective function value updated. The current solution has then completed one iteration, and the next round of trials can start from it. When the new solution is discarded, the next round of trials continues from the unchanged current solution.
The simulated annealing algorithm is insensitive to initial values: the solution it obtains is independent of the initial solution state s (the starting point of the iteration). The algorithm converges asymptotically; theoretically, it has been proved to be a global optimization algorithm that converges to the global optimal solution with probability 1. The simulated annealing algorithm also exhibits parallelism.
Simple Application of Simulated Annealing Algorithm
As an application of the simulated annealing algorithm, we discuss the traveling salesman problem (TSP): there are n cities, labeled 1, ..., n, and the distance between city i and city j is d(i, j), i, j = 1, ..., n. The TSP asks for a tour that visits every city exactly once and whose total path length is shortest.
The simulated annealing algorithm model for solving TSP can be described as follows:
Solution space: the solution space S is the set of all tours that visit every city exactly once, i.e., all cyclic permutations of {1, ..., n}. A member of S is written (w1, w2, ..., wn), with wn+1 = w1. The initial solution can be taken as (1, ..., n).
The objective function is the total length of the tour through all the cities, i.e., the cost function:
f(w1, w2, ..., wn) = d(w1, w2) + d(w2, w3) + ... + d(wn-1, wn) + d(wn, w1)
We seek the minimum value of this cost function.
Generation of a new solution: randomly generate two different values k and m between 1 and n. If k < m, the tour
(w1, w2, ..., wk, wk+1, ..., wm, ..., wn)
is changed to:
(w1, w2, ..., wm, wm-1, ..., wk+1, wk, ..., wn),
i.e., the segment from wk to wm is reversed. If k > m, the tour
(w1, w2, ..., wm, ..., wk, ..., wn)
is changed to:
(wm, wm-1, ..., w1, wm+1, ..., wk-1, wn, wn-1, ..., wk).
The above transformation method can be described briefly as "reverse the middle or reverse the two ends".
Other transformation methods can also be used; some have their own advantages, and sometimes several are used in turn to obtain better results.
Cost function difference: suppose the transformation changes the tour (w1, w2, ..., wn) into (u1, u2, ..., un); the cost function difference is then
Δf = f(u1, u2, ..., un) - f(w1, w2, ..., wn),
i.e., the difference between the lengths of the new and old tours.
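For a symmetric distance matrix (d(i, j) = d(j, i)), the "reverse the middle" move changes only the two boundary edges, so Δf can be computed in O(1); the following C++ sketch (the function name and the 0-based index convention are illustrative assumptions) shows this. Note that for an asymmetric matrix, as in the judge problem below, reversing a segment changes every edge inside it, and the difference must be summed over the whole segment:

#include <cstdio>
#include <vector>
using std::vector;

// Incremental cost change of reversing tour segment w[k..m] (0-based,
// 0 < k < m < n-1), assuming a symmetric distance matrix d.
// Only the two boundary edges change; all interior edges merely flip.
double two_opt_delta(const vector<vector<double> >& d,
                     const vector<int>& w, int k, int m) {
    int n = w.size();
    int a = w[k - 1], b = w[k];            // old edge (a, b) disappears
    int c = w[m], e = w[(m + 1) % n];      // old edge (c, e) disappears
    return d[a][c] + d[b][e] - d[a][b] - d[c][e];  // new edges (a, c) and (b, e)
}

int main() {
    // 4 cities at the corners of a unit square; reversing w[1..2] of the
    // self-crossing tour 0-2-1-3 removes the crossing, so delta < 0.
    vector<vector<double> > d = {
        {0, 1, 1.414, 1}, {1, 0, 1, 1.414},
        {1.414, 1, 0, 1}, {1, 1.414, 1, 0}};
    vector<int> w = {0, 2, 1, 3};
    printf("delta = %.3f\n", two_opt_delta(d, w, 1, 2));   // about -0.828
    return 0;
}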
Based on the above analysis, a pseudocode program that uses the simulated annealing algorithm to solve the TSP can be written:
Procedure TSPSA;
Begin
  init-of-T;                      { T is the initial temperature }
  S := (1, ..., n);               { S is the initial tour }
  termination := false;
  While termination = false do
  Begin
    For i := 1 to L do
    Begin
      generate(S' from S);        { generate a new tour S' from the current tour S }
      Δt := f(S') - f(S);         { f(S) is the total path length }
      If (Δt < 0) or (exp(-Δt/T) > random-of-[0,1]) then
        S := S';
      If the-halt-condition-is-true then
        termination := true;
    End;
    T_lower;                      { decrease the temperature }
  End;
End
The simulated annealing algorithm is widely applicable and can solve the max-cut problem, the 0-1 knapsack problem, the graph coloring problem, and scheduling problems with high efficiency.
Parameter Control of Simulated Annealing Algorithms
The simulated annealing algorithm is widely used and can solve NP-complete problems, but its parameters are difficult to control. The main issues are as follows:
(1) The initial value setting of the temperature T.
The initial value of the temperature T is one of the important factors affecting the global search performance of the simulated annealing algorithm. A high initial temperature makes it more likely that the global optimal solution is found, but costs considerable computation time; a low initial temperature saves computation time, but the global search performance may suffer. In practice, the initial temperature usually needs to be adjusted several times based on experimental results.
(2) The annealing speed problem.
The global search performance of the simulated annealing algorithm is also closely related to the annealing speed. In general, a sufficiently thorough search (annealing) at each temperature is necessary, but this takes computing time. In practical applications, the annealing schedule must be set according to the nature and features of the specific problem.
(3) The temperature management problem.
Temperature management is also one of the problems that are difficult for the simulated annealing algorithm. In practical applications, because the computational complexity must remain feasible, the following cooling method is often used:
T (t + 1) = K × T (t)
In the formula, K is a constant slightly less than 1, and t is the number of times the temperature has been lowered.
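A quick way to see the effect of K is to print how many cooling steps different values need; a small C++ sketch (the values of K, T0, and the cutoff are arbitrary examples, not from the text):

#include <cstdio>

int main() {
    const double T0 = 100.0;
    const double ks[] = {0.90, 0.95, 0.99};   // candidate attenuation constants
    for (double k : ks) {
        double T = T0;
        int t = 0;
        while (T > 0.01) { T *= k; t++; }     // T(t + 1) = K * T(t)
        printf("K = %.2f reaches T < 0.01 after %d steps\n", k, t);
    }
    return 0;
}

Larger K means slower cooling: more temperature levels, hence more total work, but also a more thorough search.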
Main Idea of Simulated Annealing Algorithm
To find the minimum value of a function, the main idea of simulated annealing is as follows: perform a random walk (randomly choosing points) over the search region (for instance, a two-dimensional plane) and apply the Metropolis sampling criterion, so that the random walk gradually converges to the optimal solution. Temperature is the important control parameter in the Metropolis algorithm; its value can be regarded as controlling how quickly the process moves toward a local or global optimal solution.
The cooling schedule, the neighborhood structure with its new-solution generator, and the acceptance criterion with its random number generator (i.e., the Metropolis algorithm) constitute the three pillars of the algorithm.
Importance sampling and the Metropolis algorithm:
Metropolis is an effective importance sampling method. Its procedure is as follows: when the system changes from one energy state to another, the energy changes from E1 to E2 with probability p = exp[-(E2 - E1)/kT]. If E2 < E1, the system accepts the new state; otherwise, it accepts it with probability p (by comparison with a uniform random number) or discards it. After this iteration is repeated a certain number of times, the system gradually tends to a stable distribution.
In importance sampling, a new state that goes downhill is accepted (local search), while one that goes uphill is accepted with a certain probability (global search). The simulated annealing method starts from an initial solution; after a large number of transformations, a relatively optimal solution of the combinatorial optimization problem is obtained for the given value of the control parameter. Then the value of the control parameter t is reduced and the Metropolis algorithm is executed repeatedly; as t tends to zero, the global optimal solution of the combinatorial optimization problem is obtained. The value of the control parameter must decrease slowly.
Temperature is the important control parameter of the Metropolis algorithm, and simulated annealing can be regarded as iterating the Metropolis algorithm while this control parameter decreases. At the beginning, the value of T is large and even strongly deteriorated solutions may be accepted; as T decreases, only mildly deteriorated solutions are accepted; finally, as T tends to 0, no deteriorated solution is accepted any more.
When the temperature is infinitely high, the system immediately reaches the uniform distribution and accepts all proposed transformations. The smaller the decrement of T at each step, the longer it takes T to reach its final value; on the other hand, the Markov chain at each temperature can then be shorter, because less time is needed to reach the quasi-equilibrium distribution.
Parameter selection:
The series of important parameters that tune the simulated annealing method is called the cooling schedule. It controls the initial value of the parameter T and its attenuation function, the corresponding Markov chain length, and the stop condition, and it is very important.
A cooling schedule should specify the following parameters:
1. the initial value t0 of the control parameter T;
2. the attenuation function of the control parameter T;
3. the length Lk of the Markov chain (that is, the number of iterations at each value of the control parameter, large enough for each random walk to approach a quasi-equilibrium distribution, i.e., a locally converged solution);
4. the selection of the end condition.
Criteria for an effective cooling schedule:
1. Convergence of the algorithm: mainly depends on the attenuation function, the length of the Markov chain, and the choice of the stop criterion.
2. Experimental performance of the algorithm: final solution quality and CPU time.
Parameter selection:
1) Choosing the initial value t0 of the control parameter
Generally, t0 must be sufficiently large, that is, the starting state is a "high temperature" state in which the Metropolis acceptance rate is approximately 1 (one heuristic for choosing such a t0 is sketched after this list).
2) Choosing the attenuation function
The attenuation function controls the annealing speed of the temperature. A commonly used function is T(n + 1) = K × T(n), where K is a constant very close to 1.
3) Choosing the length L of the Markov chain
The principle is: given that the attenuation function of the control parameter has been selected, L should be large enough that quasi-equilibrium can be restored at each value of the control parameter.
4) Conditions for termination
There are many choices of termination condition, and different choices have a great impact on the performance and solution quality of the algorithm. Only one common termination condition is introduced here: when the difference between the previous optimal solution and the latest optimal solution is less than a given tolerance, the iteration of the Markov chain can be stopped.
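Returning to 1) above: one common heuristic (an assumption of this sketch, not prescribed by the text) is to sample the objective increases of some random moves and raise t0 until the initial Metropolis acceptance ratio is close to 1, e.g. 0.98:

#include <cmath>
#include <cstdio>
#include <cstdlib>

// Hypothetical helper: given a sample of objective increases from random
// moves, find a t0 whose average acceptance ratio reaches the target.
double estimate_t0(const double* deltas, int n, double target = 0.98) {
    double T = 1.0;
    for (;;) {
        double acc = 0.0;
        for (int i = 0; i < n; i++)
            acc += (deltas[i] <= 0) ? 1.0 : exp(-deltas[i] / T);
        if (acc / n >= target) return T;   // acceptance ratio is high enough
        T *= 2.0;                          // otherwise double the temperature
    }
}

int main() {
    double deltas[] = {0.5, 1.2, 3.0, 0.1, 2.2};   // made-up sampled increments
    printf("t0 ~= %.1f\n", estimate_t0(deltas, 5));
    return 0;
}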
Example: find the minimum value of the function f(x, y) = 5sin(xy) + x² + y² using simulated annealing.
Solution: based on the problem, the cooling schedule is set as follows:
Initial temperature: 100
Attenuation parameter: 0.95
Markov chain length: 10000
Metropolis step: 0.02
Termination condition: the difference between the previous optimal solution and the latest optimal solution is less than a given tolerance.
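A C++ sketch of this schedule follows; the tolerance value, the neighborhood rule (a uniform step of at most 0.02 in each coordinate), and the starting point are assumptions, since the original gives only the table above:

#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <ctime>

double f(double x, double y) { return 5 * sin(x * y) + x * x + y * y; }

double urand() { return (double)rand() / RAND_MAX; }   // uniform in [0, 1]

int main() {
    srand(time(0));
    double T = 100.0;                 // initial temperature
    const double decay = 0.95;        // attenuation parameter
    const int L = 10000;              // Markov chain length
    const double step = 0.02;         // Metropolis step
    const double eps = 1e-6;          // assumed tolerance for termination
    double x = 0.0, y = 0.0, best = f(x, y), bx = x, by = y, prev_best;
    do {
        prev_best = best;
        for (int i = 0; i < L; i++) {
            double nx = x + step * (urand() * 2 - 1);   // random neighbor
            double ny = y + step * (urand() * 2 - 1);
            double dt = f(nx, ny) - f(x, y);
            if (dt < 0 || urand() < exp(-dt / T)) {     // Metropolis acceptance
                x = nx; y = ny;
                if (f(x, y) < best) { best = f(x, y); bx = x; by = y; }
            }
        }
        T *= decay;
    } while (fabs(best - prev_best) > eps);   // stop when the best stops improving
    printf("min f ~= %f at (%f, %f)\n", best, bx, by);
    return 0;
}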
Salesman's Problem
Time Limit: 1000 ms memory limit: 65536 K
Total submit: 153 accepted: 10
Description
A township has n villages (1 < n < 40) and a salesman. He wants to sell goods in every village. The distance s between any two villages (0 < s < 10000) is known, and the distance from village A to village B may differ from the distance from village B to village A. To improve efficiency, he starts from the village where his store is located, visits every village exactly once, and then returns to the store's village. Assume the store is in village 1. Since he does not know which route makes the journey shortest, please help him find the shortest path.
Input
The number of villages n, followed by the distances between every pair of villages (all integers).
Output
The length of the shortest path.
Sample Input
3
0 2 1
1 0 2
2 1 0
Sample output
3
Code:
/* Simulated annealing */
/* AC code: 0 ms */
#include <cstdio>
#include <cstdlib>
#include <ctime>
#include <algorithm>
using namespace std;

const int MAXN = 41;
const int RANN = 1000;       // number of tours maintained in parallel
const int RUNN = 50;         // perturbations tried per tour per round
const int INF  = 99999999;

int map[MAXN][MAXN], rpath[RANN][MAXN], Min[RANN], n;

// Perturb x[] by rn random swaps of two positions.
void adjust(int x[], int rn) {
    while (rn--) {
        int a = rand() % (n - 1) + 1;    // positions 1..n-1 hold cities 2..n
        int b = rand() % (n - 1) + 1;
        swap(x[a], x[b]);
    }
}

void get_map() {
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            scanf("%d", &map[i][j]);
}

// Generate RANN initial random tours.
void get_rpath() {
    for (int i = 0; i < RANN; i++) {
        for (int j = 1; j <= n - 1; j++)
            rpath[i][j] = j + 1;
        adjust(rpath[i], n - 1);         // initial shuffle
        Min[i] = INF;
    }
}

// Length of the tour 1 -> x[1] -> ... -> x[n-1] -> 1.
int get_dis(int x[]) {
    int sum = 0, p = 1;
    for (int i = 1; i <= n - 1; i++) {
        sum += map[p][x[i]];
        p = x[i];
    }
    sum += map[p][1];
    return sum;
}

void numcpy(int x[], int y[]) {
    for (int i = 1; i <= n - 1; i++)
        x[i] = y[i];
}

void get_ans() {
    int t = n - 1, p[MAXN];
    while (t--) {                            // the perturbation strength shrinks each round
        for (int i = 0; i < RANN; i++) {     // sweep over all RANN tours
            for (int j = 0; j < RUNN; j++) { // try RUNN perturbations of each tour
                numcpy(p, rpath[i]);         // always perturb a copy of rpath[i]
                adjust(p, t);
                int temp = get_dis(p);
                if (temp < Min[i]) {         // keep only improvements
                    numcpy(rpath[i], p);
                    Min[i] = temp;
                }
            }
        }
    }
    int ans = Min[0];
    for (int i = 1; i < RANN; i++)
        if (Min[i] < ans) ans = Min[i];
    printf("%d\n", ans);
}

int main() {
    srand(time(0));                          // seed the RNG from the clock
    while (scanf("%d", &n) != EOF) {
        get_map();
        get_rpath();
        get_ans();
    }
    return 0;
}
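Note that, strictly speaking, this accepted program never takes a worsening move: a perturbed tour is kept only when it improves. It therefore behaves more like a multi-start random local search than Metropolis-style annealing, with the shrinking number of swaps per perturbation (t decreasing from n - 1 toward 0) playing the role of the falling temperature. For n < 40 this is fast enough, as the 0 ms verdict shows.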