Particle swarm optimization (PSO) is a swarm intelligence optimization algorithm in the field of computational intelligence, alongside the ant colony algorithm and the artificial fish swarm algorithm. It was first proposed by Kennedy and Eberhart in 1995 and is derived from the study of the foraging behavior of bird flocks.
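Concretely, each particle i tracks its position x_i, its velocity v_i, its personal best position pbest_i, and the swarm-wide best position gbest. A standard form of the per-iteration update (the common textbook formulation, with inertia weight w, acceleration constants c1 and c2, and fresh uniform random numbers r1, r2 in [0, 1]) is:

```
v_i = w*v_i + c1*r1*(pbest_i - x_i) + c2*r2*(gbest - x_i)
x_i = x_i + v_i
```

The MATLAB example below follows this scheme with the inertia weight fixed at 1.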
Example Analysis 1:
Find the maximum value of y using the PSO algorithm, where x lies in the interval [-5, 5].
The MATLAB code is as follows:
%% I. Clear the environment
clc
clear all

%% II. Plot the objective function
x = -5:0.01:5;
y = 2.1*(1 - x + 5*x.^3).*exp(-x.^2/2) - x.^2;
figure
plot(x, y)
hold on

%% III. Parameter initialization (the inertia factor here defaults to 1)
c1 = 1.49445;      % acceleration constant (Eberhart's reference value)
c2 = 1.49445;      % acceleration constant (Eberhart's reference value)
maxgen = 30;       % number of generations
sizepop = 100;     % population size
Vmax = 0.5;        % upper speed limit
Vmin = -0.5;       % lower speed limit
popmax = 5;        % upper limit of x
popmin = -5;       % lower limit of x

%% IV. Generate initial particles and velocities
for i = 1:sizepop
    pop(i,:) = 5*rands(1);    % random initial population in [-5,5]
    V(i,:) = 0.5*rands(1);    % random initial velocity in [-0.5,0.5]
    % compute the fitness with the sub-function fun.m
    fitness(i) = fun(pop(i,:));
end

%% V. Individual and global extrema
[bestfitness, bestindex] = max(fitness);
zbest = pop(bestindex,:);     % global best position
gbest = pop;                  % individual best positions
fitnessgbest = fitness;       % individual best fitness values
fitnesszbest = bestfitness;   % global best fitness value

%% VI. Iterative optimization
for i = 1:maxgen
    for j = 1:sizepop
        % velocity update
        V(j,:) = V(j,:) + c1*rand*(gbest(j,:) - pop(j,:)) + c2*rand*(zbest - pop(j,:));
        V(j, V(j,:) > Vmax) = Vmax;
        V(j, V(j,:) < Vmin) = Vmin;
        % position update
        pop(j,:) = pop(j,:) + V(j,:);
        pop(j, pop(j,:) > popmax) = popmax;
        pop(j, pop(j,:) < popmin) = popmin;
        % fitness update
        fitness(j) = fun(pop(j,:));
    end
    for j = 1:sizepop
        % individual best update
        if fitness(j) > fitnessgbest(j)
            gbest(j,:) = pop(j,:);
            fitnessgbest(j) = fitness(j);
        end
        % global best update
        if fitness(j) > fitnesszbest
            zbest = pop(j,:);
            fitnesszbest = fitness(j);
        end
    end
    yy(i) = fitnesszbest;    % record the best fitness of each generation
end

%% VII. Output and plotting
[fitnesszbest zbest]                             % display the global best fitness and position
plot(zbest, fitnesszbest, 'r.', 'markersize', 10)
plot(zbest, fitnesszbest, 'ro', 'markersize', 16)
x_text = ['x = ', num2str(zbest)];               % convert the x value to a string
y_text = ['y = ', num2str(fitnesszbest)];        % convert the y value to a string
max_text = char('Global Best', x_text, y_text);  % build the label for the maximum point
text(zbest + 0.3, fitnesszbest - 1.4, max_text)  % mark the global optimum on the graph
figure
plot(yy)
title('Optimal Individual Fitness', 'fontsize', 12);
xlabel('Generation', 'fontsize', 12);
ylabel('Fitness', 'fontsize', 12);
main.m
function y = fun(x)
% fun.m: computes the fitness value of a particle
% x: input (particle position)
% y: output (fitness)
y = 2.1*(1 - x + 5*x.^3).*exp(-x.^2/2) - x.^2;
fun.m
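Since the objective is one-dimensional, the PSO result is easy to cross-check by brute force. The sketch below (written in Python rather than MATLAB so it can be run without any toolboxes; it is an illustration, not part of the original tutorial) evaluates the objective on a fine grid over [-5, 5] and reports where the maximum occurs:

```python
import numpy as np

# objective from the tutorial: y = 2.1*(1 - x + 5*x^3)*exp(-x^2/2) - x^2
def fun(x):
    return 2.1 * (1 - x + 5 * x**3) * np.exp(-x**2 / 2) - x**2

# brute-force grid search over the interval [-5, 5]
x = np.arange(-5, 5, 1e-4)
y = fun(x)
idx = np.argmax(y)
print(f"x* = {x[idx]:.4f}, y* = {y[idx]:.4f}")
```

A correct PSO run should report essentially the same maximum (near x = 1.6), up to the grid resolution.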
[MATLAB] 3. Particle swarm optimization algorithm