Reposted from http://blog.csdn.net/lvhao92/article/details/50826709
First of all, all three of these algorithms are ways of finding the optimal value (the minimum) of a cost function.
Hill climbing: Abstract the cost function as a mountain. Imagine a 2-D coordinate system: the horizontal axis is the variable, the vertical axis is the cost function, and as the variable increases the cost rises and falls like the profile of a mountain. Now put someone anywhere on that mountain (pick a point of the function). As he moves horizontally (changes the variable), his altitude changes (the cost function changes with the variable). Unfortunately, this man is bent on reaching the bottom of the mountain, so he always walks downhill; once every direction around him goes uphill, he decides he has finally reached the bottom, stops walking, and reports the position he is standing at. (In this example the cost function depends on only one variable, but in real problems cost functions depend on many variables; it is as if the man had n directions to choose from at every step, one per variable.)
The weakness of this man is easy to see: he mistakes a local minimum for the global minimum. He thinks the small valley he stopped in is the lowest place on the whole mountain (and this mountain stretches on and on, not the kind that wraps around the Earth at both ends, but the kind that is infinitely long). Too naive.
Is there any way around this? One is to randomly restart the hill climbing: pick the initial position at random and repeat the climb several times. With enough tries it may well land on the right result.
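Below is a minimal Python sketch of that idea: plain hill climbing on a made-up bumpy 1-D cost function, wrapped in random restarts. The cost function, step size, and number of trials are all illustrative assumptions, not anything from the original post.

```python
import math
import random

def cost(x):
    # A toy 1-D "mountain" with several local valleys; any cost function could go here.
    return 0.1 * x * x + 3 * math.sin(x)

def hill_climb(x, step=0.1, max_iters=10000):
    """Greedy descent: keep moving to a lower neighbour until every direction goes uphill."""
    for _ in range(max_iters):
        neighbours = [x - step, x + step]       # the "directions" the climber can try
        best = min(neighbours, key=cost)
        if cost(best) >= cost(x):               # every direction goes uphill -> he stops here
            return x
        x = best
    return x

def random_restart(trials=20, low=-20.0, high=20.0):
    """Repeat hill climbing from random starting points and keep the best valley found."""
    best = hill_climb(random.uniform(low, high))
    for _ in range(trials - 1):
        candidate = hill_climb(random.uniform(low, high))
        if cost(candidate) < cost(best):
            best = candidate
    return best

if __name__ == "__main__":
    x = random_restart()
    print("best x found:", x, "cost:", cost(x))
```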
Simulated annealing: now the man suddenly gets smarter. He figures that sometimes a step back opens up the road ahead: walking a little uphill now may mean a big downhill is waiting for him later. So as long as the climb is not too outrageous, he is willing to give it a try and see where it leads. However, as time passes, he becomes less and less willing to go uphill: at the beginning he will try almost any uphill step, but later he grows more and more reluctant. This willingness to try uphill moves is like freshly forged iron left in the air: with the passage of time it gradually cools, gradually anneals. Written as a formula, the probability of accepting a worse solution is p = e^(-(new cost - old cost) / temperature). If the new cost is lower, of course it is accepted without question. But if the new cost is higher, he starts to consider whether to try it: at first the temperature is very high, so the cost difference divided by the temperature is close to 0 and the p-value is close to 1. A typical program draws a random number between 0 and 1 and compares it with p: if the random number is smaller than p, take the uphill step; if it is bigger than p, don't. So obviously he is very willing to try at first, and then, as time goes on, the temperature drops, p gets smaller and closer to 0, and he becomes less and less willing to climb uphill.
The problem with this method is actually similar to hill climbing: each run may give a different result, so try changing the parameters (the initial temperature and the cooling rate) and run it a few times.
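Here is a minimal simulated-annealing sketch under the same assumptions as before: the toy cost function, starting temperature, cooling rate, and step size are all illustrative values chosen for demonstration, not taken from the original post.

```python
import math
import random

def cost(x):
    # Same toy bumpy landscape as in the hill-climbing sketch.
    return 0.1 * x * x + 3 * math.sin(x)

def simulated_annealing(x=10.0, temperature=1000.0, cooling=0.95, step=1.0, t_min=1e-3):
    """Always accept downhill moves; accept uphill moves with probability exp(-delta/T)."""
    while temperature > t_min:
        candidate = x + random.uniform(-step, step)
        delta = cost(candidate) - cost(x)
        # Downhill: take it. Uphill: compare a random number in [0, 1) with p = e^(-delta/T).
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            x = candidate
        temperature *= cooling                  # the iron gradually cools
    return x

if __name__ == "__main__":
    x = simulated_annealing()
    print("best x found:", x, "cost:", cost(x))
```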
Genetic algorithm: this time, imagine human beings. The environment humans live in is very cruel: only the good ones survive ("good" here meaning close to the optimal solution of the cost function). Generation after generation, each generation has mutations (small, simple, random changes to existing solutions) and also crossovers (take two of the best solutions and combine them in some way). Clearly, mutation and crossover produce new individuals, whose cost function values may go up or down. Likewise, some of these new individuals adapt to the world and survive, while others disappear from the evolutionary stream. This is the so-called survival of the fittest. It's cruel.
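A small sketch of that mutate / crossover / select loop follows, using an illustrative bit-string problem (counting zeros, lower is better) as the cost function; the population size, mutation rate, and number of survivors are all assumed for demonstration only.

```python
import random

def cost(solution):
    # Toy cost: number of zeros in a bit string; the all-ones string is the optimum.
    return solution.count(0)

def mutate(solution, rate=0.05):
    """Small, random change to an existing solution: flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in solution]

def crossover(parent_a, parent_b):
    """Combine two good solutions by splicing them at a random point."""
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def genetic_algorithm(pop_size=50, length=30, elite=10, generations=100):
    population = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=cost)               # survival of the fittest: best solutions first
        survivors = population[:elite]          # the rest disappear from the evolutionary stream
        children = []
        while len(survivors) + len(children) < pop_size:
            if random.random() < 0.5:
                children.append(mutate(random.choice(survivors)))
            else:
                children.append(crossover(random.choice(survivors), random.choice(survivors)))
        population = survivors + children
    return min(population, key=cost)

if __name__ == "__main__":
    best = genetic_algorithm()
    print("best solution:", best, "cost:", cost(best))
```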
That is hill climbing, simulated annealing, and the genetic algorithm, each described in a few words.