A Brief Note on Simulated Annealing

1. First, a plain-language description that captures the core idea of simulated annealing (SA): "appropriate" acceptance of poor solutions.

Why accept bad solutions only "appropriately"? Each iteration generates a new solution, which yields a new fitness value. If the new fitness value is smaller than the old one (taking minimization as the example), the improved solution is accepted outright. Many candidate solutions, however, are worse. If every worse solution were accepted, the search would be completely random and would converge slowly, so fully accepting bad solutions is no good. If no worse solution were ever accepted, the search would easily get stuck at a local minimum, and SA would no longer be a global optimization algorithm. "Appropriate" acceptance of bad solutions balances these two extremes.

What acceptance rule counts as "appropriate"? In the original formulation, a worse solution is accepted when P = exp(-deltaF/(k*T)) > r, where r is a uniform random number in (0, 1), k is the Boltzmann constant, and deltaF is the difference between the fitness values of the new and old solutions. This kind of random acceptance works well. Note that T decreases geometrically with the iteration count: T -> alpha*T, with alpha in (0, 1). Several constants (k, alpha) and the initial value of T must be chosen, and the right choices depend on the specific problem; these initial parameter values have a great impact on the optimization result. The example below shows one way they can be selected.

2. After the verbal description, consider the algorithm written as pseudocode. The pseudocode suggests three aspects from which SA can be improved: (1) how to move randomly to new locations; (2) the strategy for generating P, that is, how to change the "appropriate" acceptance rule; (3) the cooling schedule for the temperature T, that is, how alpha changes.

3. After the pseudocode, here is a real implementation: sa_simpledemo.m
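The acceptance rule and cooling schedule described above can be sketched in a few lines of Python (a minimal illustration, not part of the original article; the `accept` helper and its parameter names are my own):

```python
import math
import random

def accept(delta_f, T, k=1.0):
    """Metropolis-style rule: always accept an improvement; accept a
    worse solution with probability P = exp(-delta_f / (k*T))."""
    if delta_f < 0:                       # better solution (minimization)
        return True
    return math.exp(-delta_f / (k * T)) > random.random()

# Cooling schedule: T -> alpha*T with alpha in (0, 1)
T, alpha = 1.0, 0.9
for _ in range(5):
    T *= alpha                            # T shrinks geometrically
```

At high T, exp(-delta_f/(k*T)) is close to 1, so worse moves are accepted often; as T cools toward 0, the rule becomes nearly greedy.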
% Find the minimum of a function by simulated annealing
% Programmed by X. S. Yang (Cambridge University)
% Usage: sa_simpledemo
%% Show information
disp('Simulating ... it will take a minute or so!');
%% Fitness function and its graphics
% Rosenbrock test function with f* = 0 at (1, 1)
fstr = '(1-x)^2 + 100*(y-x^2)^2';
% Convert into an inline function
% (in newer MATLAB, an anonymous function f = @(x,y) ... is preferred)
f = vectorize(inline(fstr));
% Show the topography of the objective function
range = [-2 2 -2 2];
xgrid = range(1):0.1:range(2);
ygrid = range(3):0.1:range(4);
[x, y] = meshgrid(xgrid, ygrid);
surfc(x, y, f(x, y));

%% Initialize SA parameters
% Initializing parameters and settings
T_init = 1.0;        % initial temperature
T_min = 1e-10;       % final stopping temperature
                     % (e.g., T_min = 1e-10)
f_min = -1e+100;     % min value of the function
max_rej = 500;       % maximum number of rejections
max_run = 100;       % maximum number of runs
max_accept = 15;     % maximum number of accepts
k = 1;               % Boltzmann constant
alpha = 0.9;         % cooling factor
Enorm = 1e-5;        % energy norm (e.g., Enorm = 1e-8)
guess = [2 2];       % initial guess
% Initializing the counters i, j etc.
i = 0; j = 0;
accept = 0; totaleval = 0;
% Initializing various values
T = T_init;
E_init = f(guess(1), guess(2));
E_old = E_init; E_new = E_old;
best = guess;        % initially guessed values
%% SA iteration process
% Starting the simulated annealing
while (T > T_min) && (j <= max_rej) && (E_new > f_min)
    i = i + 1;
    % Check if max numbers of run/accept are met
    if (i >= max_run) || (accept >= max_accept)
        % Cooling according to a cooling schedule
        T = alpha * T;
        totaleval = totaleval + i;
        % Reset the counters
        i = 1; accept = 1;
    end
    % Function evaluations at new locations
    ns = guess + rand(1, 2) * randn;
    E_new = f(ns(1), ns(2));
    % Decide to accept the new solution
    DeltaE = E_new - E_old;
    % Accept if improved
    if (-DeltaE > Enorm)
        best = ns; E_old = E_new;
        accept = accept + 1; j = 0;
    end
    % Accept with a small probability if not improved
    if (DeltaE <= Enorm) && (exp(-DeltaE/(k*T)) > rand)
        best = ns; E_old = E_new;
        accept = accept + 1;
    else
        j = j + 1;
    end
    % Update the estimated optimal solution
    f_opt = E_old;
end
%% Show experiment results
% Display the final results
disp(strcat('Obj function: ', fstr));
disp(strcat('Evaluations: ', num2str(totaleval)));
disp(strcat('Best location: ', num2str(best)));
disp(strcat('Best estimate: ', num2str(f_opt)));
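For readers without MATLAB, the same algorithm can be cross-checked with a short Python port (a sketch, not the author's code; the step size 0.5, the 50 moves per temperature level, the seed, and the function names are my own choices):

```python
import math
import random

def rosenbrock(x, y):
    # Same test function as the MATLAB demo; f* = 0 at (1, 1)
    return (1 - x)**2 + 100 * (y - x**2)**2

def sa_minimize(seed=1):
    """Simulated annealing on the 2-D Rosenbrock function with
    geometric cooling and Metropolis acceptance."""
    rng = random.Random(seed)
    T, T_min, alpha, k = 1.0, 1e-6, 0.9, 1.0
    cur = (2.0, 2.0)                      # same initial guess as the demo
    e_cur = rosenbrock(*cur)
    best, e_best = cur, e_cur
    while T > T_min:
        for _ in range(50):               # moves per temperature level
            # Random Gaussian move around the current location
            cand = (cur[0] + rng.gauss(0, 0.5), cur[1] + rng.gauss(0, 0.5))
            e_new = rosenbrock(*cand)
            delta = e_new - e_cur
            # Accept improvements, and worse moves with probability
            # exp(-delta/(k*T))
            if delta < 0 or math.exp(-delta / (k * T)) > rng.random():
                cur, e_cur = cand, e_new
                if e_cur < e_best:
                    best, e_best = cur, e_cur
        T *= alpha                        # geometric cooling
    return best, e_best

best, f_opt = sa_minimize()
```

Tracking the best-so-far point separately from the current point (unlike the MATLAB demo, which overwrites `best` on every acceptance) guarantees the returned estimate never gets worse.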

Experiment results: running sa_simpledemo prints the objective function, the number of evaluations, the best location found, and the best estimate of the minimum. Reference: Introduction to Mathematical Optimization (Xin-She Yang).
