Application of the Particle Swarm Optimization Algorithm to Parameter Estimation of Complex Functions

Full text: http://www.lingch.net/db/download.asp?tab=softdown&fild=9&id=30

Abstract: The Particle Swarm Optimization (PSO) algorithm is one of the most effective modern heuristic search algorithms. It features simple computation, fast convergence, and accurate results. This article applies PSO to the parameter estimation of complex functions. The PSO-based parameter estimation described here recovers the parameters of complex functions accurately; it is easy to compute, converges quickly, and is widely applicable.

Keywords: particle swarm; optimization; heuristic algorithm; PSO; parameter estimation

Appendix: Matlab source program
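Before the Matlab appendix, a compact sketch of the PSO loop the abstract describes may be helpful. This is a Python illustration, not the author's implementation: all parameter names are illustrative, and unlike the appendix listing it includes the standard random factors in the velocity update.

```python
import random

def pso(f, dim, n_particles=30, n_iter=200, lo=-10.0, hi=10.0,
        w_max=0.8, w_min=0.0, c1=0.3, c2=0.8, v_max=2.0):
    """Minimal global-best PSO: decaying inertia, personal/global
    best attraction, and velocity clamping, mirroring the structure
    of the Matlab listing below."""
    # Initialize positions and velocities uniformly at random
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[random.uniform(-v_max, v_max) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_val = [f(p) for p in pos]          # personal best fitness values
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best

    for it in range(n_iter):
        # Inertia decays linearly from w_max to w_min (Wn = 1 in the listing)
        w = w_min + (w_max - w_min) * (n_iter - it) / n_iter
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-v_max, min(v_max, vel[i][d]))  # clamp velocity
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:           # improve personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:          # improve global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Usage: minimize the sphere function, whose minimum is 0 at the origin.
random.seed(0)
best, best_val = pso(lambda p: sum(x * x for x in p), dim=3)
```

The same loop applies to least-squares parameter estimation by using the sum of squared errors as the fitness `f`, which is what the Matlab program below does.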
% ------------------------------ Main program
% Application of the particle swarm optimization algorithm to least-squares parameter estimation
% Chen Lingling
% Post2Ling@hotmail.com

Points = 50;   % number of sample points
VEC = 4;       % dimension of the parameter vector to be estimated

Pop = 200;     % population size
Vmax = 20;     % maximum velocity
Vmin = -20;    % minimum velocity
Niter = 150;   % number of iterations
Wmax = 0.8;    % maximum inertia factor
Wmin = 0.0;    % minimum inertia factor
Wn = 1.0;      % inertia factor decay exponent
FaII = 0.3;    % individual (cognitive) learning rate
Faig = 0.8;    % social learning rate

% Generate samples X, Y from a 4-parameter logistic with additive noise
X = 51 + rand(Points, 1) * 50;
Y = 100 + 200 ./ (1 + exp(0.3 .* (X - 75))) + (rand(Points, 1) - 0.5) * 20;

% Particle swarm state matrix P, one row per particle; columns are:
% current position (VEC), personal best position (VEC), global best position (VEC),
% current fitness, personal best fitness, global best fitness
P = rand(Pop, VEC * 3 + 3) * 300;
P(:, VEC+1:VEC*2) = P(:, 1:VEC);

% Evaluate fitness
P(:, VEC*3+1) = fitness(P(:, 1:VEC), X, Y);
P(:, VEC*3+2) = P(:, VEC*3+1);
P(:, VEC*3+3) = P(1, VEC*3+1);
% Update the global best position and fitness
for j = 1:Pop
    if P(j, VEC*3+2) < P(1, VEC*3+3)
        P(1, VEC*3+3) = P(j, VEC*3+2);
        P(1, VEC*2+1:VEC*3) = P(j, VEC+1:VEC*2);
    end
end
P(:, VEC*2+1:VEC*3) = ones(Pop, 1) * P(1, VEC*2+1:VEC*3);
P(:, VEC*3+3) = P(1, VEC*3+3);

% Initialize particle velocities
V = (rand(Pop, VEC) - 0.5) * 50;

for i = 1:Niter

    % Update the inertia factor (decays from Wmax to Wmin)
    w = ((Niter - i)^Wn) / (Niter^Wn) * (Wmax - Wmin) + Wmin;

    % Update velocities
    V = w .* V + FaII .* (P(:, VEC+1:VEC*2) - P(:, 1:VEC)) + Faig .* (P(:, VEC*2+1:VEC*3) - P(:, 1:VEC));
    % Clamp velocities
    V(V > Vmax) = Vmax;
    V(V < Vmin) = Vmin;

    % Update positions
    P(:, 1:VEC) = P(:, 1:VEC) + V;

    % Evaluate fitness
    P(:, VEC*3+1) = fitness(P(:, 1:VEC), X, Y);

    % Update each particle's personal best position and fitness
    for j = 1:Pop
        if P(j, VEC*3+1) < P(j, VEC*3+2)
            P(j, VEC+1:VEC*2) = P(j, 1:VEC);
            P(j, VEC*3+2) = P(j, VEC*3+1);
        end
    end

    % Update the global best position and fitness
    for j = 1:Pop
        if P(j, VEC*3+2) < P(1, VEC*3+3)
            P(1, VEC*3+3) = P(j, VEC*3+2);
            P(1, VEC*2+1:VEC*3) = P(j, VEC+1:VEC*2);
        end
    end
    % Broadcast the global best to every row for vectorized computation
    P(:, VEC*2+1:VEC*3) = ones(Pop, 1) * P(1, VEC*2+1:VEC*3);
    P(:, VEC*3+3) = P(1, VEC*3+3);

end

% Output the result (p3 is optimized at 300x scale, so divide it by 300)
% [P(1, VEC*2+1:VEC*3), P(1, VEC*3+3)]
[P(1, VEC*2+1), P(1, VEC*2+2), P(1, VEC*2+3)/300, P(1, VEC*2+4)]

bp = 51:100;
hr = P(1, VEC*2+1) + P(1, VEC*2+2) ./ (1 + exp(P(1, VEC*2+3)./300 .* (bp - P(1, VEC*2+4))));

h = plot(X, Y, 'r*', bp, hr, 'b');
xlabel('x')
ylabel('logistic')
title('4-parameter logistic function and samples (p1 = 100; p2 = 200; p3 = 0.3; p4 = 75)')
legend('samples', 'estimated function')

% ---------------------- Fitness function
function f = fitness(P, x, y)
% Least-squares fitness function
% The fitness of a particle is the sum of squared errors; the goal is to minimize it

p_size = size(P);
x_size = size(x);

% Result: one fitness value per particle
f = zeros(p_size(1), 1);

for i = 1:p_size(1)      % for each particle
    for j = 1:x_size(1)  % for each sample point
        % accumulate the squared error of sample j under particle i's parameters
        f(i) = f(i) + (y(j) - (P(i,1) + P(i,2) ./ (1 + exp(P(i,3)./300 .* (x(j,1) - P(i,4)))))) ^ 2;
    end
end
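The fitness function above is the sum of squared errors of the 4-parameter logistic model. A Python sketch of the same objective, written as an illustration (it drops the `/300` scaling that the Matlab code uses because its particles are initialized in [0, 300]; `logistic4` and `sse_fitness` are names introduced here):

```python
import math

def logistic4(p, x):
    """4-parameter logistic: p1 + p2 / (1 + exp(p3 * (x - p4)))."""
    p1, p2, p3, p4 = p
    return p1 + p2 / (1.0 + math.exp(p3 * (x - p4)))

def sse_fitness(p, xs, ys):
    """Sum of squared errors -- the quantity fitness() computes per particle."""
    return sum((y - logistic4(p, x)) ** 2 for x, y in zip(xs, ys))

# With noise-free samples drawn from the true parameters, the error is zero,
# so the true parameter vector is a global minimizer of this fitness.
true_p = (100.0, 200.0, 0.3, 75.0)
xs = [55.0, 65.0, 75.0, 85.0, 95.0]
ys = [logistic4(true_p, x) for x in xs]
```

Minimizing `sse_fitness` over `p` with the PSO loop is exactly the least-squares estimation the program performs.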
