I. Introduction to hill climbing
Hill climbing is an optimization algorithm that usually starts from a random solution and iteratively improves it until it reaches an optimal solution (typically a local optimum). If the problem has multiple parameters, hill climbing searches for a better solution by increasing or decreasing one parameter at a time by one unit. For example, suppose a problem's solution consists of three integer parameters x1, x2, x3, and the search starts from (2, 2, -2). Increasing/decreasing x1 by 1 gives two solutions, (1, 2, -2) and (3, 2, -2); increasing/decreasing x2 by 1 gives (2, 3, -2) and (2, 1, -2); increasing/decreasing x3 by 1 gives (2, 2, -1) and (2, 2, -3). Together with the current solution, this yields the candidate set:
(2, 2, -2), (1, 2, -2), (3, 2, -2), (2, 3, -2), (2, 1, -2), (2, 2, -1), (2, 2, -3)
The best solution in this set becomes the new current solution, and the process repeats: build the candidate set of the new solution, pick its best member, and so on. When the best candidate is no better than the current solution, the climb is finished.
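The candidate-set construction described above can be sketched as a small helper function (a minimal sketch; the name `neighbors` is my own, and range bounds are ignored here for brevity):

```python
def neighbors(sol):
    """Return sol itself plus every solution obtained by shifting
    one parameter up or down by one unit (range bounds ignored)."""
    result = [sol]
    for i in range(len(sol)):
        result.append(sol[:i] + [sol[i] - 1] + sol[i + 1:])
        result.append(sol[:i] + [sol[i] + 1] + sol[i + 1:])
    return result

# The example from the text: all candidates around (2, 2, -2).
print(neighbors([2, 2, -2]))
```

Hill climbing then simply picks the candidate with the best objective value and repeats from there.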
II. A Python example
Consider the function y = x1 + x2 - x3, where x1 is an integer in [-2, 5], x2 is an integer in [2, 6], and x3 is an integer in [-5, 2]. Use hill climbing to find the solution that minimizes y.
The code is as follows:
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

if __name__ == '__main__':
    x_range = [[-2, 5], [2, 6], [-5, 2]]
    # Start from a random solution inside the allowed ranges.
    best_sol = [random.randint(x_range[0][0], x_range[0][1]),
                random.randint(x_range[1][0], x_range[1][1]),
                random.randint(x_range[2][0], x_range[2][1])]
    while True:
        best_evaluate = evaluate(best_sol[0], best_sol[1], best_sol[2])
        current_best_value = best_evaluate
        # Candidate set: the current solution plus every in-range
        # neighbor obtained by shifting one parameter by +/-1.
        sols = [best_sol]
        for i in range(len(best_sol)):
            if best_sol[i] > x_range[i][0]:
                sols.append(best_sol[0:i] + [best_sol[i] - 1] + best_sol[i+1:])
            if best_sol[i] < x_range[i][1]:
                sols.append(best_sol[0:i] + [best_sol[i] + 1] + best_sol[i+1:])
        print(sols)
        for s in sols:
            el = evaluate(s[0], s[1], s[2])
            if el < best_evaluate:
                best_sol = s
                best_evaluate = el
        # Stop when no candidate improves on the current solution.
        if best_evaluate == current_best_value:
            break
    print('Best sol:', current_best_value, best_sol)
One run produces the following output:
[[0, 5, 1], [-1, 5, 1], [1, 5, 1], [0, 4, 1], [0, 6, 1], [0, 5, 0], [0, 5, 2]]
[[-1, 5, 1], [-2, 5, 1], [0, 5, 1], [-1, 4, 1], [-1, 6, 1], [-1, 5, 0], [-1, 5, 2]]
[[-2, 5, 1], [-1, 5, 1], [-2, 4, 1], [-2, 6, 1], [-2, 5, 0], [-2, 5, 2]]
[[-2, 4, 1], [-1, 4, 1], [-2, 3, 1], [-2, 5, 1], [-2, 4, 0], [-2, 4, 2]]
[[-2, 3, 1], [-1, 3, 1], [-2, 2, 1], [-2, 4, 1], [-2, 3, 0], [-2, 3, 2]]
[[-2, 2, 1], [-1, 2, 1], [-2, 3, 1], [-2, 2, 0], [-2, 2, 2]]
[[-2, 2, 2], [-1, 2, 2], [-2, 3, 2], [-2, 2, 1]]
Best sol: -2 [-2, 2, 2]
As shown, the minimum value found is -2, reached at x1 = -2, x2 = 2, x3 = 2.
III. How to find the global optimum
The solution found by hill climbing may only be a local optimum. To obtain a better solution, run the hill-climbing algorithm several times, each time starting from a different random initial solution, and keep the best of the local optima found; this best solution is likely, though not guaranteed, to be the global optimum.
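This restart strategy can be sketched by wrapping the algorithm from section II in a function and calling it repeatedly (a minimal sketch; the function name `hill_climb` and the restart count of 20 are illustrative choices, not part of the original code):

```python
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

def hill_climb(x_range):
    """One hill-climbing run from a random start; returns (value, solution)."""
    sol = [random.randint(lo, hi) for lo, hi in x_range]
    while True:
        best = evaluate(*sol)
        # In-range neighbors: one parameter shifted by +/-1.
        candidates = []
        for i in range(len(sol)):
            if sol[i] > x_range[i][0]:
                candidates.append(sol[:i] + [sol[i] - 1] + sol[i+1:])
            if sol[i] < x_range[i][1]:
                candidates.append(sol[:i] + [sol[i] + 1] + sol[i+1:])
        improved = False
        for s in candidates:
            if evaluate(*s) < best:
                sol, best = s, evaluate(*s)
                improved = True
        if not improved:
            return best, sol

x_range = [[-2, 5], [2, 6], [-5, 2]]
# Restart hill climbing several times and keep the best local optimum found.
best_value, best_sol = min(hill_climb(x_range) for _ in range(20))
print('Best sol:', best_value, best_sol)
```

For this particular linear objective every run already reaches the global optimum, but for objectives with many local optima the restarts are what make the difference.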
In addition, simulated annealing is an algorithm designed to find the global optimum: it occasionally accepts a worse solution, which lets the search escape local optima.
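As a rough illustration of that idea applied to the same problem (a sketch only, not a tuned implementation; the starting temperature, cooling rate, and step count are arbitrary assumptions of mine):

```python
import math
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

def anneal(x_range, t0=10.0, cooling=0.95, steps=500):
    """Simulated annealing for minimization; returns (value, solution)."""
    sol = [random.randint(lo, hi) for lo, hi in x_range]
    best = list(sol)
    t = t0
    for _ in range(steps):
        # Pick one random neighbor: a single parameter shifted by +/-1,
        # clamped to its allowed range.
        i = random.randrange(len(sol))
        new = list(sol)
        new[i] = min(max(new[i] + random.choice([-1, 1]),
                         x_range[i][0]), x_range[i][1])
        diff = evaluate(*new) - evaluate(*sol)
        # Always accept improvements; accept worse moves with probability
        # exp(-diff / t), which shrinks as the temperature t decreases.
        if diff < 0 or random.random() < math.exp(-diff / t):
            sol = new
        if evaluate(*sol) < evaluate(*best):
            best = list(sol)
        t *= cooling
    return evaluate(*best), best

print(anneal([[-2, 5], [2, 6], [-5, 2]]))
```

Early on, the high temperature lets the search jump out of local optima; as the temperature falls, the behavior converges toward plain hill climbing.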