Particle Swarm Algorithm (4) ---- Classification of Particle Swarm Algorithms


The particle swarm algorithm is mainly divided into four major branches:

(1) Variants of the standard particle swarm algorithm

This branch mainly changes and adjusts the parameters of the standard particle swarm algorithm: the inertia factor, the convergence factor (constriction factor), the "cognition" coefficient c1, and the "society" coefficient c2, in the hope of obtaining better results.
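To make the roles of these parameters concrete, here is a minimal sketch of the standard update step for one particle, assuming a real-valued search space; the function name pso_step and the default parameter values are illustrative, not taken from the original text.

    import numpy as np

    def pso_step(x, v, pbest, gbest, w=0.7, c1=2.0, c2=2.0, rng=np.random):
        """One standard PSO update for a single particle."""
        r1 = rng.random(x.shape)  # random weights for the "cognition" part
        r2 = rng.random(x.shape)  # random weights for the "society" part
        # inertia term + pull toward the particle's own best + pull toward the swarm best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        return x + v, v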

In the original version the inertia factor stays constant; it was later suggested that the inertia factor should decrease gradually as the algorithm iterates. At the start of the run, a large inertia factor keeps the algorithm from falling into local optima too easily; in the later stages, a small inertia factor speeds up convergence and makes it more stable, avoiding oscillation. In my tests, dynamically decreasing the inertia factor w does make the algorithm more stable and perform better.

But how should the inertia factor decrease? The first idea that comes to mind is linear decrement, which does work well, but is it the best? Studies of decrement strategies have concluded that a linear decrease works better than a convex-function decrease, while a concave-function decrease works better than the linear one. My own tests are basically consistent with this conclusion, although the effect is not very pronounced.
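The three decrement strategies can be written as simple schedules over the iteration counter k. This is a sketch; the range 0.9 down to 0.4 is a common choice in the literature, not a value from the text.

    def inertia(k, k_max, w_start=0.9, w_end=0.4, shape="linear"):
        t = k / k_max  # fraction of the run completed, in [0, 1]
        span = w_start - w_end
        if shape == "linear":   # straight-line decrease
            return w_start - span * t
        if shape == "concave":  # stays large early, drops quickly near the end
            return w_start - span * t ** 2
        if shape == "convex":   # drops quickly at the start, flattens out later
            return w_end + span * (1 - t) ** 2
        raise ValueError(shape)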

For the convergence factor, it has been proved that a value of 0.729 guarantees that the algorithm converges, although this does not guarantee convergence to the global optimum; in practice a convergence factor of 0.729 also gives good results.

Adjustments to the social and cognitive coefficients c2 and c1 have also been proposed: c1 should start large and then shrink, while c2 should do the opposite. The reasoning is that at the beginning of the run, each bird must rely on a large cognitive component of its own and a relatively small social component. This is much like a group of people searching for something: at first we rely mostly on our own knowledge to search, but as experience accumulates we begin to reach consensus (social knowledge), and from then on we rely on that social knowledge instead.
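A sketch of such time-varying coefficients, with c1 shrinking and c2 growing over the run; the endpoint values 2.5 and 0.5 are a common choice in the literature, not values given in the text.

    def acceleration(k, k_max, c_max=2.5, c_min=0.5):
        t = k / k_max  # fraction of the run completed, in [0, 1]
        c1 = c_max - (c_max - c_min) * t  # self-cognition fades over time
        c2 = c_min + (c_max - c_min) * t  # social pull grows over time
        return c1, c2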

In 2007, two Greek scholars proposed combining the fast-converging global version of the algorithm with the local version, which does not fall into local optima as easily, using the formulas below.

v = N * v(global version) + (1 - N) * v(local version)    (velocity update formula; v denotes the velocity)

x(k+1) = x(k) + v    (position update formula; x denotes the position)

The paper discusses different choices of the coefficient N and runs the algorithm 20,000 times to analyze the results under the various coefficient settings.
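A sketch of this blended update, assuming the two component velocities take the usual global-best and neighborhood-best forms; lbest (the best position in the particle's neighborhood) and the parameter names are assumptions, since the text only gives the blending formula itself.

    import numpy as np

    def unified_step(x, v, pbest, gbest, lbest, n=0.5,
                     w=0.7, c1=2.0, c2=2.0, rng=np.random):
        # Global-version velocity: pulled toward the swarm-wide best gbest.
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v_glob = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        # Local-version velocity: pulled toward the neighborhood best lbest.
        r3, r4 = rng.random(x.shape), rng.random(x.shape)
        v_loc = w * v + c1 * r3 * (pbest - x) + c2 * r4 * (lbest - x)
        # Blend: v = N * v(global) + (1 - N) * v(local), then move.
        v = n * v_glob + (1 - n) * v_loc
        return x + v, v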

(2) Hybrids of the particle swarm algorithm

This branch mainly mixes the particle swarm algorithm with other algorithms. Some combine it with the simulated annealing algorithm, others with the simplex method, but most often it is mixed with the genetic algorithm. The three operators of the genetic algorithm give rise to three different hybrid algorithms.

The combination of the particle swarm algorithm with the selection operator works as follows: in the original particle swarm algorithm we take the best value found by the swarm as Pg, whereas the hybrid version assigns each particle a selection probability based on its fitness relative to the whole swarm and then selects the particle that serves as Pg according to those probabilities; the rest of the algorithm is unchanged. Such an algorithm maintains the diversity of the swarm while it runs, but its fatal drawback is a slow convergence rate.
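A sketch of the fitness-proportional (roulette-wheel) selection of Pg described above, assuming a maximization problem with positive fitness values; for minimization, the fitness would first need to be transformed.

    import numpy as np

    def select_pg(positions, fitness, rng=np.random):
        # Each particle's selection probability is proportional to its fitness.
        p = fitness / fitness.sum()
        i = rng.choice(len(positions), p=p)
        return positions[i]  # the chosen particle serves as Pg this iteration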

The combination of the particle swarm algorithm with the crossover operator borrows the basic idea of the genetic algorithm: during the run, particles are crossed pairwise according to their fitness, for example with the simple formula

w_new = n * w1 + (1 - n) * w2

where w1 and w2 are the parent particles of the new particle. This algorithm introduces new particles while it runs, but once the swarm falls into a local optimum, the algorithm still has difficulty escaping it.
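A sketch of this pairwise (arithmetic) crossover; drawing n uniformly from [0, 1] is an assumption, as the text does not say how n is chosen.

    import numpy as np

    def crossover(w1, w2, rng=np.random):
        n = rng.random()  # blending coefficient in [0, 1]
        # The child is a convex combination of its two parents.
        return n * w1 + (1 - n) * w2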

The combination of the particle swarm algorithm with the mutation operator tests the distance of every particle from the current optimum: when the distance drops below a certain threshold, a fixed percentage of all particles (for example 10%) is randomly re-initialized, so that the swarm can search for the optimal value afresh.
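A sketch of this re-initialization step; the threshold eps, the use of the mean distance, and uniform re-seeding inside the search bounds lo and hi are all assumptions, since the text only says "when the distance is less than a certain number".

    import numpy as np

    def reseed_if_converged(positions, gbest, lo, hi, ratio=0.1, eps=1e-3,
                            rng=np.random):
        # Distance of every particle from the current optimum gbest.
        dist = np.linalg.norm(positions - gbest, axis=1)
        if dist.mean() < eps:  # the swarm has collapsed onto one point
            k = max(1, int(ratio * len(positions)))
            idx = rng.choice(len(positions), size=k, replace=False)
            # Re-initialize the chosen particles uniformly inside the bounds.
            positions[idx] = lo + (hi - lo) * rng.random((k, positions.shape[1]))
        return positions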
