Survival of the fittest is a universal law of nature, achieved mainly through selection and variation. Selection embodies the basic idea of optimization, while variation (diversity) embodies the basic idea of random, non-deterministic search. "Survival of the fittest" is the core of algorithmic search: different hyper-heuristic algorithms arise from different survival-of-the-fittest strategies. The main ideas behind hyper-heuristic algorithms come from humanity's long and careful observation of physical, biological, and social phenomena in nature, a deep understanding of these phenomena, and a gradual process of learning from nature by imitating the mechanisms through which natural phenomena operate.
Genetic algorithm: based on biological evolution, it simulates the selection, crossover, and mutation of chromosomes during the evolutionary process. Fitter individuals have a higher chance of surviving into the next generation.
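The selection–crossover–mutation loop can be sketched as follows. This is a minimal illustration, not a reference implementation: the one-max fitness function and all parameter values are assumptions chosen for the example.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      crossover_rate=0.9, mutation_rate=0.02):
    # Random initial population of bit strings.
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # Selection: a size-2 tournament gives fitter individuals a higher
        # chance of survival.
        selected = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        children = []
        for p1, p2 in zip(selected[::2], selected[1::2]):
            # Crossover: exchange tails of two parents at a random point.
            if random.random() < crossover_rate:
                cut = random.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # Mutation: flip each bit with a small probability.
            for c in (c1, c2):
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        c[i] = 1 - c[i]
                children.append(c)
        pop = children
        best = max([best] + pop, key=fitness)
    return best

# One-max toy problem: fitness is the number of 1 bits, optimum is all ones.
best = genetic_algorithm(sum)
```

With these settings the population typically converges to (or very near) the all-ones string within the 100 generations.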
Simulated annealing: simulates the annealing and crystallization of solid matter in physics. During the search, an improving solution is always accepted, while a worse solution is accepted with a certain probability (the idea of diversification, or variation), allowing the search to jump out of local optima.
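The acceptance rule described above (always accept improvements, accept worse solutions with a temperature-dependent probability) can be sketched as follows; the cooling schedule, test function, and neighbor move are illustrative assumptions.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, cooling=0.99, steps=5000):
    x, best = x0, x0
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        delta = cost(y) - cost(x)
        # Always accept an improvement; accept a worse solution with
        # probability exp(-delta / t), which shrinks as the temperature cools.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = y
        if cost(x) < cost(best):
            best = x
        t *= cooling          # geometric cooling schedule (an assumption)
    return best

# Minimize a 1-D multimodal function; neighbors are small random moves.
f = lambda x: x * x + 10 * math.sin(3 * x)
best = simulated_annealing(f, lambda x: x + random.uniform(-0.5, 0.5), 5.0)
```

Because worse moves are sometimes accepted while the temperature is high, the search can leave the basin around the starting point instead of stopping at the first local minimum.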
Neural network: simulates the information processing of neurons in the brain. Through competition and cooperation among neurons, the processes of selection and variation are realized.
Tabu search: simulates human memory and experience. A tabu list records information from the recent search history and forbids certain moves, preventing the search from turning back and helping it jump out of local optima.
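A minimal sketch of the tabu-list mechanism, on an assumed toy problem (minimizing the distance to an integer target with moves of ±1 and ±5):

```python
from collections import deque

def tabu_search(cost, neighbors, x0, tabu_size=10, iterations=200):
    x, best = x0, x0
    # The tabu list remembers recently visited solutions so the search
    # does not immediately turn back to them.
    tabu = deque([x0], maxlen=tabu_size)
    for _ in range(iterations):
        candidates = [y for y in neighbors(x) if y not in tabu]
        if not candidates:
            break
        # Move to the best non-tabu neighbor even if it is worse than the
        # current solution; this is what lets the search climb out of a
        # local optimum instead of stalling there.
        x = min(candidates, key=cost)
        tabu.append(x)
        if cost(x) < cost(best):
            best = x
    return best

# Toy example: minimize |x - 42| over the integers.
best = tabu_search(lambda x: abs(x - 42),
                   lambda x: [x - 5, x - 1, x + 1, x + 5], 0)
```

Note that the loop always moves to the best non-tabu neighbor, while `best` separately records the best solution seen so far.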
Ant colony algorithm: simulates the collective foraging behavior of ants, whose indirect cooperation through pheromone trails guides the colony toward good solutions.
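The pheromone mechanism can be sketched with a small ant-colony loop for the traveling salesman problem. This is a simplified illustration of the idea, not any particular published variant; the parameter values and the 4-city test instance are assumptions.

```python
import math
import random

def ant_colony(dist, n_ants=10, iterations=50, evaporation=0.5,
               alpha=1.0, beta=2.0):
    n = len(dist)
    pheromone = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(iterations):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # Prefer edges with strong pheromone and short distance.
                weights = [pheromone[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in choices]
                tour.append(random.choices(choices, weights)[0])
            tours.append(tour)
        # Evaporate old pheromone, then deposit new pheromone in
        # proportion to tour quality (shorter tour -> more pheromone).
        pheromone = [[p * evaporation for p in row] for row in pheromone]
        for tour in tours:
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                pheromone[i][j] += 1.0 / length
                pheromone[j][i] += 1.0 / length
    return best_tour, best_len

# Four cities on a unit square; the optimal tour is the perimeter, length 4.
r = math.sqrt(2)
dist = [[0, 1, r, 1], [1, 0, 1, r], [r, 1, 0, 1], [1, r, 1, 0]]
tour, length = ant_colony(dist)
```

Evaporation plays the role of forgetting: edges that stop being reinforced lose influence, which keeps the colony from locking onto an early poor tour.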
These hyper-heuristic algorithms share a common feature: starting from a random initial feasible solution, they use an iterative improvement strategy to approximate the optimal solution of the problem.
Their basic elements are: (1) a random initial feasible solution;
(2) an evaluation function (usually related to the objective function value);
(3) a neighborhood structure for generating new feasible solutions;
(4) selection and acceptance criteria;
(5) a termination criterion.
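The five elements above fit one shared skeleton, sketched below with hypothetical argument names (one per element). Plugging in a greedy acceptance rule recovers plain local search; a probabilistic rule, as in simulated annealing, adds the ability to escape local optima.

```python
import random

def iterative_improvement(initial, evaluate, neighbor, accept, stop):
    x = initial()                            # (1) random initial feasible solution
    best = x
    step = 0
    while not stop(step):                    # (5) termination criterion
        y = neighbor(x)                      # (3) neighborhood move
        if accept(evaluate(x), evaluate(y)): # (2) evaluation, (4) acceptance
            x = y
        if evaluate(x) < evaluate(best):
            best = x
        step += 1
    return best

# Example instantiation: greedy local search minimizing (x - 3)^2.
best = iterative_improvement(
    initial=lambda: random.uniform(-10, 10),
    evaluate=lambda x: (x - 3) ** 2,
    neighbor=lambda x: x + random.uniform(-1, 1),
    accept=lambda old, new: new < old,       # greedy: only accept improvements
    stop=lambda step: step >= 2000,
)
```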
Among them, element (4) reflects the ability of a hyper-heuristic algorithm to overcome local optima.
Although heuristic algorithms have been studied for nearly 50 years, many shortcomings remain:
1. There is no unified and complete theoretical system for heuristic algorithms.
2. Because of NP-hardness, all kinds of heuristic algorithms inevitably encounter local optima; how to judge whether a solution is locally or globally optimal, and how to escape local optima, remain open questions.
3. Each kind of heuristic algorithm has its own advantages; how to combine them effectively is still an open problem.
4. The parameters of a heuristic algorithm play a vital role in its performance, yet how to set them effectively remains unresolved.
5. Heuristic algorithms lack effective iteration stopping conditions.
6. The convergence speed of heuristic algorithms needs further study.