I. Introduction to the hill climbing method
Hill climbing is an optimization algorithm that starts from a random solution and gradually improves it until it reaches an optimal solution (a local optimum). Suppose the problem has several parameters; at each step of the hill climbing method we obtain candidate solutions by increasing or decreasing the value of one parameter by one unit. For example, suppose a problem's solution uses three integer parameters x1, x2, x3, initially set to (2, 2, -2). Increasing/decreasing x1 by 1 gives two solutions (1, 2, -2) and (3, 2, -2); increasing/decreasing x2 by 1 gives (2, 3, -2) and (2, 1, -2); increasing/decreasing x3 by 1 gives (2, 2, -1) and (2, 2, -3). Together with the current solution this yields the solution set:
(2, 2, -2), (1, 2, -2), (3, 2, -2), (2, 3, -2), (2, 1, -2), (2, 2, -1), (2, 2, -3)
The best solution is selected from this set, a new solution set is then constructed around it in the same way, and the best solution is selected again; the "climbing" ends when the best solution found is the same as the previous one.
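The neighbor-generation step described above can be sketched as follows (the helper name neighbors is my own; solutions are plain Python lists):

```python
def neighbors(sol):
    # For each parameter, produce one solution with that parameter
    # decreased by 1 and one with it increased by 1.
    result = []
    for i in range(len(sol)):
        result.append(sol[:i] + [sol[i] - 1] + sol[i+1:])
        result.append(sol[:i] + [sol[i] + 1] + sol[i+1:])
    return result

# The six neighboring solutions of (2, 2, -2) from the example above.
print(neighbors([2, 2, -2]))
```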
II. A Python example
Consider the function y = x1 + x2 - x3, where x1 is an integer in the interval [-2, 5], x2 is an integer in the interval [2, 6], and x3 is an integer in the interval [-5, 2]. We use the hill climbing method to find the solution that minimizes y.
The code is as follows:
import random

# Objective function to minimize.
def evaluate(x1, x2, x3):
    return x1 + x2 - x3

if __name__ == '__main__':
    # Valid integer range for each of the three parameters.
    x_range = [[-2, 5], [2, 6], [-5, 2]]
    # Start from a random solution.
    best_sol = [random.randint(x_range[0][0], x_range[0][1]),
                random.randint(x_range[1][0], x_range[1][1]),
                random.randint(x_range[2][0], x_range[2][1])]
    while True:
        best_evaluate = evaluate(best_sol[0], best_sol[1], best_sol[2])
        current_best_value = best_evaluate
        # Build the solution set: the current solution plus its in-range neighbors.
        sols = [best_sol]
        for i in range(len(best_sol)):
            if best_sol[i] > x_range[i][0]:
                sols.append(best_sol[0:i] + [best_sol[i] - 1] + best_sol[i+1:])
            if best_sol[i] < x_range[i][1]:
                sols.append(best_sol[0:i] + [best_sol[i] + 1] + best_sol[i+1:])
        print(sols)
        # Pick the best solution in the set.
        for s in sols:
            el = evaluate(s[0], s[1], s[2])
            if el < best_evaluate:
                best_sol = s
                best_evaluate = el
        # Stop when no neighbor improves on the current solution.
        if best_evaluate == current_best_value:
            break
    print('Best sol:', current_best_value, best_sol)
The results of a run are as follows:
[[0, 5, 1], [-1, 5, 1], [1, 5, 1], [0, 4, 1], [0, 6, 1], [0, 5, 0], [0, 5, 2]]
[[-1, 5, 1], [-2, 5, 1], [0, 5, 1], [-1, 4, 1], [-1, 6, 1], [-1, 5, 0], [-1, 5, 2]]
[[-2, 5, 1], [-1, 5, 1], [-2, 4, 1], [-2, 6, 1], [-2, 5, 0], [-2, 5, 2]]
[[-2, 4, 1], [-1, 4, 1], [-2, 3, 1], [-2, 5, 1], [-2, 4, 0], [-2, 4, 2]]
[[-2, 3, 1], [-1, 3, 1], [-2, 2, 1], [-2, 4, 1], [-2, 3, 0], [-2, 3, 2]]
[[-2, 2, 1], [-1, 2, 1], [-2, 3, 1], [-2, 2, 0], [-2, 2, 2]]
[[-2, 2, 2], [-1, 2, 2], [-2, 3, 2], [-2, 2, 1]]
Best sol: -2 [-2, 2, 2]
As the output shows, the optimal value is -2, attained at x1 = -2, x2 = 2, x3 = 2.
III. How to find the global optimum
The solution found by the hill climbing method is a local optimum. To obtain a better solution, we can run the hill climbing algorithm several times, each time starting from a different random initial solution, and take the best of the resulting local optima; this best solution may be the global optimum.
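A minimal sketch of this restart strategy for the same problem (the function name hill_climb and the restart count of 20 are my own illustrative choices):

```python
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

def hill_climb(x_range):
    # One hill climbing run from a random initial solution.
    sol = [random.randint(lo, hi) for lo, hi in x_range]
    while True:
        val = evaluate(*sol)
        # The current solution plus all in-range neighbors.
        sols = [sol]
        for i in range(len(sol)):
            if sol[i] > x_range[i][0]:
                sols.append(sol[:i] + [sol[i] - 1] + sol[i+1:])
            if sol[i] < x_range[i][1]:
                sols.append(sol[:i] + [sol[i] + 1] + sol[i+1:])
        best = min(sols, key=lambda s: evaluate(*s))
        if evaluate(*best) == val:
            return val, sol  # local optimum reached
        sol = best

x_range = [[-2, 5], [2, 6], [-5, 2]]
# Restart from 20 different random initial solutions and keep the best result.
best_val, best_sol = min(hill_climb(x_range) for _ in range(20))
print('Best over restarts:', best_val, best_sol)
```

Since y = x1 + x2 - x3 is linear in each parameter, every restart happens to reach the same optimum here; restarts only pay off on objectives with multiple local optima.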
In addition, simulated annealing is another algorithm for seeking the global optimum.
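For comparison, here is a minimal simulated annealing sketch for the same problem; the temperature schedule and all numeric parameters are arbitrary illustrative choices, not from the original text:

```python
import math
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

x_range = [[-2, 5], [2, 6], [-5, 2]]
sol = [random.randint(lo, hi) for lo, hi in x_range]
val = evaluate(*sol)
best_sol, best_val = sol, val
t = 10.0  # initial temperature (arbitrary)
while t > 0.01:
    # Random neighbor: change one parameter by +/-1, staying in range.
    i = random.randrange(len(sol))
    v = sol[i] + random.choice([-1, 1])
    if x_range[i][0] <= v <= x_range[i][1]:
        cand = sol[:i] + [v] + sol[i+1:]
        cand_val = evaluate(*cand)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / t), which shrinks as t cools.
        if cand_val < val or random.random() < math.exp((val - cand_val) / t):
            sol, val = cand, cand_val
            if val < best_val:
                best_sol, best_val = sol, val
    t *= 0.95  # geometric cooling (arbitrary rate)
print('SA best:', best_val, best_sol)
```

Unlike pure hill climbing, the occasional acceptance of worse moves lets the search escape local optima, at the cost of needing a cooling schedule tuned to the problem.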