Wolpert and Macready proposed the No Free Lunch (NFL) theorem, an important theoretical result in the field of optimization with far-reaching significance. Its conclusions can be summarized as follows:
Theorem 1. For any two algorithms A and B (stochastic or deterministic), the average performance over the set of all possible problems is identical (performance may be measured in various ways, e.g., the quality of the best solution found or the convergence rate):

∑_F P(c | F, n, A) = ∑_F P(c | F, n, B)

where c is the probability curve of individual fitness values, F is the fitness function, and n is the population size.
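The averaging in Theorem 1 can be illustrated with a toy experiment. The sketch below (the names `alg_A`, `alg_B`, and `best_after` are illustrative, not from the original text) enumerates every possible fitness function on a three-point search space with binary fitness values, then compares two deterministic non-revisiting search algorithms, a forward scan and a backward scan, by their best-found fitness after a fixed number of evaluations. Averaged over all functions, the two algorithms perform identically, as the NFL theorem predicts:

```python
from itertools import product

X = [0, 1, 2]   # search space
Y = [0, 1]      # possible fitness values
m = 2           # number of distinct evaluations allowed

def best_after(order, f, m):
    # best-so-far fitness after evaluating the first m points of `order`
    return max(f[x] for x in order[:m])

# Two deterministic, non-revisiting search algorithms:
# a forward scan and a backward scan of the search space.
alg_A = [0, 1, 2]
alg_B = [2, 1, 0]

# Enumerate every possible fitness function f: X -> Y (2^3 = 8 of them)
all_functions = [dict(zip(X, values)) for values in product(Y, repeat=len(X))]

avg_A = sum(best_after(alg_A, f, m) for f in all_functions) / len(all_functions)
avg_B = sum(best_after(alg_B, f, m) for f in all_functions) / len(all_functions)

print(avg_A, avg_B)  # identical averages over all problems
```

On any single function one scan order may beat the other, but summed over the whole function set the advantage cancels exactly; this is the content of the sum equality in Theorem 1.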
Radcliffe and Surry reached the same conclusion. For example, if an evolutionary algorithm solves problem set A with better performance than simulated annealing, then simulated annealing will perform better than the evolutionary algorithm on some other problem set B. Averaged over all problems, the performance of the two algorithms is identical, and no algorithm is better than random search. According to the NFL theorem, whether an algorithm's performance is "good" or "bad" depends on the particular problem, and specifically on the probability curve c of individual fitness; clearly, only with knowledge of c can the performance of an algorithm be evaluated.
For example, the multi-objective coordinated evolutionary algorithm (MOCEA) is especially effective for solving high-dimensional multi-objective optimization problems, yet because of its algorithmic complexity and heavy computational cost it is not necessarily superior to other existing multi-objective evolutionary algorithms.