Simulated annealing algorithm
Inspired by the physical process of annealing a solid, Kirkpatrick and his colleagues recognized the similarity between combinatorial optimization problems and the solid annealing process, drew an analogy between the two, and proposed the simulated annealing algorithm for solving combinatorial optimization problems.
Table 7.3 shows the analogy between the combinatorial optimization problem and the solid annealing process.
Table 7.3: Analogy between combinatorial optimization problems and annealing processes
Solid annealing process | Combinatorial optimization problem
A state of the physical system | A solution of the problem
The energy of the state | The indicator function of the solution
The lowest-energy state | The optimal solution
Temperature | Control parameter
Consider a combinatorial optimization problem defined on a finite set S, where i ∈ S is a solution of the problem and f(i) is the indicator function of the solution i. Following the analogy of Table 7.3, i corresponds to a state of the physical system, f(i) to the energy E(i) of that state, and a control parameter t, whose value decreases as the algorithm proceeds, corresponds to the temperature T in solid annealing; the thermal motion of the particles is replaced by the exchange of solutions within a neighborhood. This establishes the correspondence between a combinatorial optimization problem and the solid annealing process.
When solving a combinatorial optimization problem, a large value of t is given first, corresponding to a high temperature T, and a randomly chosen solution i is taken as the initial solution of the problem. At the given t, a new solution j, j ∈ N(i), is randomly generated, where N(i) is the neighborhood of i. The probability of transferring from solution i to the new solution j is determined according to the Metropolis criterion, namely:

P_t(i→j) = 1,                     if f(j) ≤ f(i)
P_t(i→j) = exp(-(f(j) - f(i))/t), otherwise
If the new solution j is accepted, j replaces i; otherwise i is kept. This process is repeated until equilibrium is reached under the control parameter t. Corresponding to the slow decrease of temperature in annealing, after sufficient state transfers the control parameter t is lowered slowly, and the process is repeated under each value of t, until t has become small enough. Finally, an optimal solution of the combinatorial optimization problem is obtained. Because this process simulates the annealing process, it is called the simulated annealing algorithm.
Below, we give a description of the simulated annealing algorithm.
The algorithm has two nested loops. The inner loop simulates the process by which the system reaches thermal equilibrium at a given temperature: each iteration randomly generates a new solution and then accepts it at random according to the Metropolis criterion. Random(0, 1) in the algorithm is a generator of uniformly distributed random numbers in [0, 1]; combined with the transfer probability from solution i to a worse solution j, it simulates whether the system accepts the inferior solution j. The outer loop simulates the cooling process: the control parameter tk plays a role similar to the temperature T, indicating the temperature of the system at the k-th cycle, and drop(tk) in the algorithm is a temperature-decrease function that lowers the temperature slowly according to certain principles.
The simulated annealing algorithm is similar to the local search algorithm; the biggest difference is that simulated annealing randomly accepts some inferior solutions according to the Metropolis criterion, that is, solutions with a larger value of the indicator function. When the temperature is high, the probability of accepting an inferior solution is large; at the initial high temperature, inferior solutions are accepted with a probability of almost 100%. As the temperature decreases, the probability of accepting an inferior solution decreases gradually, until, as the temperature tends to 0, it also tends to 0. This helps the algorithm jump out of local optima and obtain the global optimum of the problem.
The simulated annealing algorithm above only gives a framework. Three important conditions, namely the choice of the initial temperature, the termination condition of the inner loop, and the termination condition of the outer loop, are not specified in the algorithm, and they are the key to simulated annealing. As described earlier, for a solid annealing process to leave the physical system in the lowest-energy state with probability 1, the following three conditions must be met during annealing:
(1) The initial temperature must be high enough;
(2) At each temperature, the exchange of states must be sufficient;
(3) The drop in temperature T must be slow enough.
These three conditions correspond exactly to the three important conditions left unspecified in the algorithm. As with the solid annealing process, for the simulated annealing algorithm to solve the problem with probability 1, at least these three conditions must be satisfied. However, the requirement that "the initial temperature must be high enough, the state exchanges must be sufficient, and the temperature must decline slowly enough" conflicts with our original intention of finding a low-complexity algorithm for combinatorial optimization problems. If simulated annealing remained an algorithm of exponential complexity, it would be meaningless for solving complex combinatorial optimization problems. The question, then, is how to weaken some of these conditions so that a satisfactory solution of a combinatorial optimization problem can be obtained within polynomial time complexity. In the next section we discuss these issues and give some basic methods for determining the initial temperature and the termination conditions of the inner and outer loops.
Determination of parameters
From the above analysis we know that the basic conditions for the simulated annealing algorithm to find the global optimum with probability 1 are that the initial temperature must be high enough, the state exchanges at each temperature must be sufficient, and the decline of the temperature t must be slow enough. Therefore, the determination of parameters such as the initial temperature T0, the number of state exchanges at each temperature, the method of decreasing the temperature t, and the temperature at which the algorithm terminates must all be considered when solving a problem with simulated annealing. Theoretically, the ability of the algorithm to reach the optimal solution rests on an infinite number of state transfers during the search, so as an optimization algorithm its time complexity is still exponential, and it cannot be used for solving large-scale combinatorial optimization problems. For many practical problems, however, as already discussed in the first section, finding the exact optimal solution matters little, and a satisfactory solution is sufficient. Whether a satisfactory solution can be obtained in polynomial time is thus our main concern.
Not every set of parameters guarantees that the simulated annealing algorithm converges to a good approximate solution. A large number of experiments show that the quality of the solution grows with the running time of the algorithm, and it is difficult to have the best of both worlds. Below we give some methods for determining the parameters or criteria of the simulated annealing algorithm, attempting a compromise between solution time and solution quality. These parameters or criteria include:
(1) the initial temperature T0;
(2) the attenuation function of the temperature t, i.e. the method of decreasing the temperature;
(3) the termination criterion of the algorithm, given by a termination temperature Tf or a termination condition;
(4) the length Lk of the Markov chain at each temperature Tk.
1. The initial temperature T0
A suitable initial temperature can also be obtained by gradually heating, analogous to the heating-up phase of the solid. The method is as follows:
(1) Give a desired initial acceptance probability P0, and a low initial temperature T0, e.g. T0 = 1;
(2) Randomly generate a sequence of states and compute the acceptance rate of the sequence, i.e. the number of accepted states divided by the total number of states generated. If the acceptance rate is greater than the given initial acceptance probability P0, go to (4);
(3) Increase the temperature, update T0, and go to (2);
(4) End.
To update T0, one can double it each time:
T0 = 2 × T0
or increase it by a fixed amount each time:
T0 = T0 + ΔT
where ΔT is a constant given in advance.
2. The method of decreasing temperature
The annealing process requires that the temperature drop slowly enough. Common temperature-decrease methods include the following:
(1) The proportion decreases.
This method decreases the temperature by the same proportion each time using an attenuation factor α:
Tk+1 = αTk,  k = 0, 1, ....  (7.36)
where Tk is the current temperature, Tk+1 is the temperature at the next step, and α is a constant. The closer α is to 1, the more slowly the temperature drops; a value around 0.8~0.95 is commonly chosen. The method is simple and practical, and is a common way of decreasing the temperature.
(2) Equal-step decrease
The temperature is decreased by a fixed amount ΔT each time:
Tk+1 = Tk - ΔT  (7.37)
If K is the expected total number of temperature drops, then:
ΔT = T0 / K  (7.38)
where T0 is the initial temperature.
The advantage of this method is that the total number of temperature drops can be controlled, but because each drop is a fixed amount, if ΔT is set too small the temperature drops too slowly at high temperatures, while if it is set too large the temperature drops too fast at low temperatures.
3. Stop criteria at each temperature
At each temperature, the simulated annealing algorithm requires sufficient state exchanges. If Lk denotes the number of iterations at temperature Tk, then Lk should be large enough for the Markov chain at that temperature to become essentially stationary.
There are several common stopping criteria:
(1) Fixed length method
This is the simplest method: the same Lk is used at every temperature. The choice of Lk is problem-specific; it is usually related to the size of the neighborhood and is typically chosen as a polynomial function of the problem size n. For example, for the traveling salesman problem with n cities, if neighbors are generated by exchanging two cities, Lk can be chosen as cn, cn², etc., where c is a constant.
(2) Stop criteria based on acceptance rate
From the earlier analysis of the annealing process we know that at high temperatures the probabilities of the system being in each state are essentially equal, and the acceptance probability of each state is close to 1, so even a small number of iterations basically reaches a stationary state. As the temperature decreases, the number of rejected states increases, so at low temperatures the number of iterations should be increased to avoid falling prematurely into a local optimum because of too few iterations. An intuitive idea, then, is to increase the number of iterations appropriately as the temperature drops.
One way is to specify a number of acceptances R: at a given temperature, the iteration stops and the next temperature begins only after the number of accepted states reaches R. As the temperature decreases, the probability of a state being accepted decreases, so this criterion automatically increases the number of iterations as the temperature drops. However, because the acceptance probability is very low at low temperatures, an upper limit on the number of iterations is generally set to prevent an excessive number of iterations; when the upper limit is reached, the iteration at that temperature stops even if the number of acceptances has not reached R.
Similarly, one can specify a state acceptance rate r, where r equals the number of accepted states at that temperature divided by the total number of states generated. If the acceptance rate reaches r, the iteration at that temperature stops and the next temperature begins. To prevent too few or too many iterations, lower and upper bounds on the number of iterations are usually defined, and the iteration at a temperature stops only when the number of iterations reaches the lower bound and the required acceptance rate r is met, or when the maximum number of iterations is reached.
A stopping criterion can also be defined by introducing the concept of a "generation": during the iteration, a run of several neighboring states is called a generation, and if the difference of the indicator function between the solutions of two neighboring generations is less than a specified value, the iteration at that temperature stops.
The number of iterations at a given temperature is closely related to how much the temperature drops: the smaller the drop, the smaller the difference between the stationary distributions at two adjacent temperatures should be. Some studies have shown that an excessive number of iterations contributes little to improving the quality of the solution and only increases the computation time. Therefore one generally chooses a relatively small temperature decay per step, with an appropriately large number of iterations.
4. Termination principle of the algorithm
The simulated annealing algorithm descends gradually from the initial temperature T0 and ends when the temperature has dropped to a certain value. A reasonable termination condition should let the algorithm converge to an approximate solution of the problem, guarantee a certain quality of that solution, and stop the algorithm within an acceptable finite time.
Generally there are several methods to determine the termination of the algorithm.
(1) Zero-degree method
Theoretically, the simulated annealing algorithm ends when the temperature approaches 0. Therefore a small positive constant can be set, and the algorithm ends when the temperature drops below it.
(2) Total Cycle control method
A number K of temperature drops is specified in advance, and the algorithm stops when the number of temperature iterations reaches K. This requires a suitable K to be given: if K is chosen poorly, the running time is needlessly increased for small-scale problems, while for large-scale problems it may be difficult to obtain high-quality solutions.
(3) No change control method
As the temperature drops, although the simulated annealing algorithm randomly accepts some inferior solutions, in general the quality of the solutions obtained should improve gradually, especially at low temperatures. If the value of the indicator function of the solution does not change over n adjacent temperatures, the algorithm can be regarded as having converged. Even if it has converged only to a local optimum, the algorithm can be terminated, because the probability of jumping out of a local optimum at low temperature is very small.
Source of the above content:
Ma Shaoping, Zhu Xiaoyan, Artificial Intelligence, Tsinghua University Press
Assignment notes:
The code below follows the pseudocode described above; it is recommended to run and debug it from the command line, as follows:
SA.h
#ifndef SA_H
#define SA_H
#include <iostream>
#include <fstream>
#include <vector>
#include <string>
#include <cstdlib>
#include <cmath>
#include <ctime>
using namespace std;

const double INIT_T = 300;     // initial temperature
const double RATE = 0.92;      // temperature decay rate
const int IN_LOOP = 2000;      // number of inner-loop iterations
const double FINAL_T = 0.001;  // termination temperature of the outer loop
const int LIMIT = 1000;        // upper bound for probabilistic selection

typedef struct {
    char name;
    double x;
    double y;
} _city;

class SA {
private:
    int cityNum;               // number of cities read from the input file
    _city city;
    vector<_city> path;        // current path (equilibrium state at each temperature)
    double pointDist[20][20];  // distance matrix between cities
    char bestPath[20];
public:
    void input(const string &);
    double calPointDist(char, char);
    double calLength(vector<_city>);
    vector<_city> getNext(vector<_city>);
    void solve(const string &);     // solve and output the path
    SA() { srand((int)time(0)); }   // seed the random number generator
};
#endif
SA.cpp
#include "SA.h"
#include <string>

void SA::input(const string &filename) {
    ifstream fin;
    fin.open(filename.c_str());
    if (!fin) {                       // check whether the file opened
        cout << "Unable to open the file!" << endl;
        exit(1);
    }
    string line;
    getline(fin, line);
    cityNum = atoi(line.c_str());     // read the number of cities
    while (getline(fin, line)) {      // read the file line by line until EOF
        city.name = line[0];
        string::size_type pos1, pos2;
        pos1 = line.find_first_of('\t');   // first '\t' in the line
        pos2 = line.find_last_of('\t');    // last '\t' in the line
        city.x = atof(line.substr(pos1 + 1, 6).c_str());
        city.y = atof(line.substr(pos2 + 1, 6).c_str());
        path.push_back(city);
    }
    double l;
    for (int i = 0; i < cityNum; i++)      // store inter-city distances in a 2-D array
        for (int j = i + 1; j < cityNum; j++) {
            l = (path[j].x - path[i].x) * (path[j].x - path[i].x);
            l += (path[j].y - path[i].y) * (path[j].y - path[i].y);
            pointDist[i][j] = sqrt(l);
            pointDist[j][i] = sqrt(l);
        }
}

double SA::calPointDist(char c1, char c2) {   // distance between two cities
    return pointDist[c1 - 65][c2 - 65];
}

vector<_city> SA::getNext(vector<_city> curPath) {
    int i, j;
    do {
        i = (int)(cityNum * rand() / (RAND_MAX + 1.0));   // random index in [0, cityNum-1]
        j = (int)(cityNum * rand() / (RAND_MAX + 1.0));
    } while (!(i < j));               // two distinct random indices with i < j
    while (i < j) {
        swap(curPath[i++], curPath[j--]);   // generate a new solution by reversing a segment
    }
    return curPath;
}

double SA::calLength(vector<_city> curPath) {   // length of the current path
    double totalLength = 0;
    for (int i = 0; i < cityNum - 1; i++)
        totalLength += calPointDist(curPath[i].name, curPath[i + 1].name);
    totalLength += calPointDist(curPath[cityNum - 1].name, curPath[0].name);
    return totalLength;
}

void SA::solve(const string &filename) {
    vector<_city> curPath;
    vector<_city> newPath;
    double curLen, newLen;
    curPath = path;
    curLen = calLength(curPath);
    double delta;
    double p;
    double t = INIT_T;
    while (true) {
        for (int i = 0; i < IN_LOOP; i++) {   // fixed-length inner loop: reach equilibrium
            newPath = getNext(curPath);
            newLen = calLength(newPath);
            delta = newLen - curLen;          // change of the indicator function
            if (delta < 0) {                  // better solution: always accept
                curPath = newPath;
                curLen = newLen;
            } else {                          // accept a worse solution with some probability
                p = (double)(1.0 * rand() / (RAND_MAX + 1.0));
                if (exp(-delta / t) > p) {
                    curPath = newPath;
                    curLen = newLen;
                }
            }
        }
        path = curPath;
        for (int i = 0; i < cityNum; i++)
            cout << curPath[i].name << " ";
        cout << curLen << endl;
        static ofstream fout(filename.c_str());
        if (!fout) {
            cout << "error!" << endl;
            exit(1);
        }
        for (int i = 0; i < cityNum; i++)
            fout << path[i].name << " ";      // write the output file
        fout << calLength(path) << endl;
        if (t < FINAL_T) break;
        t = t * RATE;                         // proportional temperature decrease
    }
}
Main.cpp
#include "SA.h"
#include <cstdio>

int main(int argc, char *argv[]) {
    SA tsp;
    // tsp.input("G:\\homework3\\tsp20.txt");
    tsp.input(argv[1]);
    // tsp.solve("G:\\homework3\\o.txt");
    tsp.solve(argv[2]);
    getchar();
    return 0;
}
Artificial Intelligence homework3: solving the TSP with simulated annealing