1. Introduction
Hill climbing is an optimization algorithm. It typically starts from a random solution and moves step by step toward a better one, eventually reaching a (possibly local) optimum. If the problem has several parameters, each step of the climb increases or decreases the value of one parameter in turn. For example, suppose a problem uses three integer parameters x1, x2, and x3, initially set to (2, 2, -2). Increasing/decreasing x1 by 1 gives the two solutions (1, 2, -2) and (3, 2, -2); increasing/decreasing x2 by 1 gives (2, 3, -2) and (2, 1, -2); increasing/decreasing x3 by 1 gives (2, 2, -1) and (2, 2, -3). This yields the solution set:
(2, 2, -2), (1, 2, -2), (3, 2, -2), (2, 3, -2), (2, 1, -2), (2, 2, -1), (2, 2, -3)
Find the best solution in this set, then build a new solution set around it in the same way and find the best solution again. The "climb" ends when the best solution of one round is the same as the best solution of the previous round.
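The neighborhood construction described above can be sketched as follows (a minimal illustration that ignores parameter range bounds for brevity; the function name `neighbors` is ours, not from the article):

```python
def neighbors(sol):
    """Return the solution set: the solution itself plus every
    solution obtained by changing one parameter by +/- 1."""
    sols = [list(sol)]
    for i in range(len(sol)):
        for step in (-1, 1):
            s = list(sol)
            s[i] += step
            sols.append(s)
    return sols

# The worked example from the text: 1 original solution + 6 neighbors.
print(neighbors([2, 2, -2]))
```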
2. Python example
Let's take the function y = x1 + x2 - x3, where x1 is an integer in [-2, 5], x2 is an integer in [2, 6], and x3 is an integer in [-5, 2]. Use hill climbing to find the solution that minimizes y.
The code is as follows:
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

if __name__ == '__main__':
    x_range = [[-2, 5], [2, 6], [-5, 2]]
    # start from a random solution within the parameter ranges
    best_sol = [random.randint(x_range[0][0], x_range[0][1]),
                random.randint(x_range[1][0], x_range[1][1]),
                random.randint(x_range[2][0], x_range[2][1])]
    while True:
        best_evaluate = evaluate(best_sol[0], best_sol[1], best_sol[2])
        current_best_value = best_evaluate
        # build the solution set: the current solution plus its neighbors,
        # respecting the range bounds of each parameter
        sols = [best_sol]
        for i in range(len(best_sol)):
            if best_sol[i] > x_range[i][0]:
                sols.append(best_sol[0:i] + [best_sol[i] - 1] + best_sol[i + 1:])
            if best_sol[i] < x_range[i][1]:
                sols.append(best_sol[0:i] + [best_sol[i] + 1] + best_sol[i + 1:])
        print(sols)
        # pick the best solution in the set
        for s in sols:
            el = evaluate(s[0], s[1], s[2])
            if el < best_evaluate:
                best_sol = s
                best_evaluate = el
        # stop when no neighbor improves on the current solution
        if best_evaluate == current_best_value:
            break
    print('best sol:', current_best_value, best_sol)
One run produces the following output:
[[0, 5, 1], [-1, 5, 1], [1, 5, 1], [0, 4, 1], [0, 6, 1], [0, 5, 0], [0, 5, 2]]
[[-1, 5, 1], [-2, 5, 1], [0, 5, 1], [-1, 4, 1], [-1, 6, 1], [-1, 5, 0], [-1, 5, 2]]
[[-2, 5, 1], [-1, 5, 1], [-2, 4, 1], [-2, 6, 1], [-2, 5, 0], [-2, 5, 2]]
[[-2, 4, 1], [-1, 4, 1], [-2, 3, 1], [-2, 5, 1], [-2, 4, 0], [-2, 4, 2]]
[[-2, 3, 1], [-1, 3, 1], [-2, 2, 1], [-2, 4, 1], [-2, 3, 0], [-2, 3, 2]]
[[-2, 2, 1], [-1, 2, 1], [-2, 3, 1], [-2, 2, 0], [-2, 2, 2]]
[[-2, 2, 2], [-1, 2, 2], [-2, 3, 2], [-2, 2, 1]]
best sol: -2 [-2, 2, 2]
We can see that the minimum value of y is -2, attained at x1 = -2, x2 = 2, and x3 = 2.
3. How to find the global optimum
The solution found by hill climbing may only be a local optimum. To obtain a better answer, run the hill-climbing algorithm several times, each time starting from a different random initial solution, and take the best of the resulting local optima; that solution may well be the global optimum.
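This multi-start idea can be sketched as follows. The helper `hill_climb` repeats the loop from section 2, and `random_restart` runs it from several random starting points; both function names are ours, introduced for illustration:

```python
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

X_RANGE = [[-2, 5], [2, 6], [-5, 2]]

def hill_climb(sol):
    """One run of the hill-climbing loop from section 2."""
    while True:
        best_val = evaluate(*sol)
        current = best_val
        # build the neighbor set, respecting the parameter ranges
        sols = [sol]
        for i in range(len(sol)):
            if sol[i] > X_RANGE[i][0]:
                sols.append(sol[:i] + [sol[i] - 1] + sol[i + 1:])
            if sol[i] < X_RANGE[i][1]:
                sols.append(sol[:i] + [sol[i] + 1] + sol[i + 1:])
        for s in sols:
            v = evaluate(*s)
            if v < best_val:
                sol, best_val = s, v
        if best_val == current:   # no neighbor improved: local optimum
            return sol, best_val

def random_restart(n_starts=10):
    """Run hill climbing from several random initial solutions and
    keep the best local optimum found."""
    best_sol, best_val = None, float('inf')
    for _ in range(n_starts):
        start = [random.randint(lo, hi) for lo, hi in X_RANGE]
        sol, val = hill_climb(start)
        if val < best_val:
            best_sol, best_val = sol, val
    return best_sol, best_val

print(random_restart())
```

For this particular y, every restart reaches the same answer because the function is linear in each parameter; random restarts pay off on objectives with multiple local optima.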
In addition, simulated annealing is another algorithm that attempts to find a global optimum.
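A minimal simulated-annealing sketch for the same problem is shown below. Unlike hill climbing, it sometimes accepts a worse neighbor, with a probability that shrinks as the "temperature" cools, which lets it escape local optima. The temperature schedule and step counts here are illustrative choices, not from the original article:

```python
import math
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

X_RANGE = [[-2, 5], [2, 6], [-5, 2]]

def simulated_annealing(temp=10.0, cooling=0.95, steps=500):
    sol = [random.randint(lo, hi) for lo, hi in X_RANGE]
    best, best_val = sol[:], evaluate(*sol)
    for _ in range(steps):
        # pick a random neighbor: change one parameter by +/- 1, clamped to its range
        i = random.randrange(len(sol))
        cand = sol[:]
        cand[i] = min(max(cand[i] + random.choice((-1, 1)),
                          X_RANGE[i][0]), X_RANGE[i][1])
        delta = evaluate(*cand) - evaluate(*sol)
        # always accept improvements; accept worse moves with probability e^(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            sol = cand
        if evaluate(*sol) < best_val:
            best, best_val = sol[:], evaluate(*sol)
        temp *= cooling   # cool down
    return best, best_val

print(simulated_annealing())
```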