Particle Swarm Optimization Algorithm

Source: Internet
Author: User
Introduction: The basic PSO algorithm, explained and demonstrated with a simple MATLAB script
A simple implementation of the standard PSO algorithm (the demo script below is MATLAB).

1. First, a plain-language description of PSO. The core idea of particle swarm optimization can be summed up in one sentence: strong performers tend to appear in clusters. So if you want to become stronger yourself, move toward the strongest performers and learn from them.

The original inspiration for PSO was flocks of birds (and schools of fish) searching for food. It rests on a simple assumption: food is likely to be found near birds that have already found some, so the best strategy is to search around them. Picture a flock of birds perched on a wire: one flies down and finds grain on the road, and the others, seeing this, fly down and search near that bird.

Particle swarm is one of the simplest heuristic optimization algorithms. To understand it, you only need elementary vector arithmetic: addition, subtraction, and multiplication by a number. Its core is two update formulas:

    v_i^(t+1) = v_i^t + α ε1 ⊙ (g* − x_i^t) + β ε2 ⊙ (x_i* − x_i^t)
    x_i^(t+1) = x_i^t + v_i^(t+1)

They look parameter-heavy, but every symbol is simple: α and β are constants; ε1 and ε2 are random vectors with entries between 0 and 1; ⊙ is element-wise multiplication of two vectors. g* is the best solution found by the whole swarm so far, and x_i* is the best solution in particle i's personal history. Both correction terms take a "best" position and subtract the current position x; this is the "learning from the strongest" step: a vector pointing toward a better solution is added to the particle's own position, much like folding knowledge learned from an expert into your own knowledge system. v is an intermediate variable (the velocity) and can be eliminated algebraically.

2. After the verbal description, consider the algorithm in pseudo-code form. If you want to improve PSO, the most obvious target is this update formula, though there are many other avenues: a Google Scholar search turns up more than 30,000 papers on PSO and its variants.
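The two update formulas above can be sketched in C for a single coordinate of one particle. This is an illustrative sketch, not code from the original post: the names `pso_update` and `rand01` are my own, and the C library `rand()` stands in for the random vectors ε1 and ε2.

```c
#include <stdlib.h>

/* Uniform random number in [0, 1]; rand() plays the role of
 * the random factors eps1 and eps2 in the formulas above. */
static double rand01(void) { return (double)rand() / RAND_MAX; }

/* One PSO update for a single coordinate of one particle.
 * v: velocity (the intermediate variable), x: current position,
 * gbest: global best g*, pbest: personal best x_i*,
 * alpha/beta: the two constant weights. */
void pso_update(double *v, double *x,
                double gbest, double pbest,
                double alpha, double beta)
{
    *v += alpha * rand01() * (gbest - *x)
        + beta  * rand01() * (pbest - *x);
    *x += *v;
}
```

Note that when a particle already sits on both "best" positions, the two correction terms vanish and it simply keeps drifting with its current velocity, which is exactly what the formulas say.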
3. After the pseudo-code description, let's look at the real implementation: pso_simpledemo.m
% The Particle Swarm Optimization
% (written by X-S Yang, Cambridge University)
% Usage: pso_simpledemo(number_of_particles, num_iterations)
% e.g.: best = pso_simpledemo(20, 10);
% where best = [xbest ybest zbest]  % an n by 3 matrix
%   xbest(i)/ybest(i) are the best at ith iteration
function [best] = pso_simpledemo(n, num_iterations)
% n = number of particles
% num_iterations = total number of iterations
if nargin < 2, num_iterations = 10; end
if nargin < 1, n = 20; end
% Michalewicz function: f* = -1.801 at [2.20319, 1.57049]
% Split into two parts to avoid a long line for printing
str1 = '-sin(x)*(sin(x^2/3.14159))^20';
str2 = '-sin(y)*(sin(2*y^2/3.14159))^20';
funstr = strcat(str1, str2);
% Converting to an inline function and vectorizing
f = vectorize(inline(funstr));
% range = [xmin xmax ymin ymax];
range = [0 4 0 4];

% ----------------------------------------------------
% Setting the parameters: alpha, beta
% Random amplitude of roaming particles, alpha in [0, 1]
% alpha = gamma^t = 0.7^t;
% Speed of convergence (0->1) = (slow->fast)
beta = 0.5;
% ----------------------------------------------------
% Grid values of the objective function
% These values are used for visualization only
Ngrid = 100;
dx = (range(2) - range(1))/Ngrid;
dy = (range(4) - range(3))/Ngrid;
xgrid = range(1):dx:range(2);
ygrid = range(3):dy:range(4);
[x, y] = meshgrid(xgrid, ygrid);
z = f(x, y);
% Display the shape of the function to be optimized
figure(1);
surfc(x, y, z);
% ---------------------------------------------------
best = zeros(num_iterations, 3);  % initialize history
% ----- Start Particle Swarm Optimization -----------
% Generating the initial locations of n particles
[xn, yn] = init_pso(n, range);
% Display the paths of particles in a figure
% with a contour of the objective function
figure(2);
% Start iterations
for i = 1:num_iterations,
  % Show the contour of the function
  contour(x, y, z, 15); hold on;
  % Find the current best location (xo, yo)
  zn = f(xn, yn);
  zn_min = min(zn);
  xo = min(xn(zn == zn_min));
  yo = min(yn(zn == zn_min));
  zo = min(zn(zn == zn_min));
  % Trace the paths of all roaming particles
  % Display these roaming particles
  plot(xn, yn, '.', xo, yo, '*'); axis(range);
  % The accelerated PSO with alpha = gamma^t
  gamma = 0.7; alpha = gamma.^i;
  % Move all the particles to new locations
  [xn, yn] = pso_move(xn, yn, xo, yo, alpha, beta, range);
  drawnow;
  % Use "hold on" to display paths of particles
  hold off;
  % History
  best(i, 1) = xo; best(i, 2) = yo; best(i, 3) = zo;
end  % end of iterations
% ----- All subfunctions are listed here -----
% Initial locations of n particles
function [xn, yn] = init_pso(n, range)
xrange = range(2) - range(1);
yrange = range(4) - range(3);
xn = rand(1, n)*xrange + range(1);
yn = rand(1, n)*yrange + range(3);
% Move all the particles toward (xo, yo)
function [xn, yn] = pso_move(xn, yn, xo, yo, a, b, range)
nn = size(yn, 2);  % a = alpha, b = beta
xn = xn.*(1 - b) + xo.*b + a.*(rand(1, nn) - 0.5);
yn = yn.*(1 - b) + yo.*b + a.*(rand(1, nn) - 0.5);
[xn, yn] = findrange(xn, yn, range);
% Make sure the particles stay within the range
function [xn, yn] = findrange(xn, yn, range)
nn = length(yn);
for i = 1:nn,
  if xn(i) <= range(1), xn(i) = range(1); end
  if xn(i) >= range(2), xn(i) = range(2); end
  if yn(i) <= range(3), yn(i) = range(3); end
  if yn(i) >= range(4), yn(i) = range(4); end
end

Experiment results: the script draws the surface of the Michalewicz function in Figure 1, and the particle paths over its contour plot in Figure 2. Reference: Xin-She Yang, Introduction to Mathematical Optimization.
