8.4 Greedy


1. Knapsack problems:

① Optimal loading problem: sort the objects by weight in ascending order and select them in that order; this greedy choice yields the optimal solution (the maximum number of objects loaded).

② Fractional knapsack problem: sort the objects by value-to-weight ratio in descending order and select them in that order, taking a fraction of the last object if it does not fit entirely (a greedy criterion can only be greedy on a single variable, so combining value and weight into one ratio is the clever conversion; see the sketch after this list).

③ Boat problem: greedily minimize the wasted capacity of each boat; keeping the waste at every step minimal keeps the total waste minimal.
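
As a minimal sketch of the fractional knapsack greedy in ② (the function name and the example data are illustrative, not from the original text):

def fractional_knapsack(items, capacity):
    # items: list of (value, weight); returns the maximum total value.
    # Greedy criterion: value / weight, processed in descending order;
    # the last object picked may be taken fractionally.
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole object, or the remaining fraction
        total += value * take / weight
        capacity -= take
    return total

# Ratio order is (60, 10) -> (100, 20) -> (120, 30); with capacity 50 the answer is 240.0.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))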

2. Interval-related problems (sort by the left endpoint or the right endpoint?):

① Selecting disjoint intervals: choose as many pairwise non-intersecting intervals as possible; sort by right endpoint and always take the earliest-ending interval that does not overlap the last one chosen (see the sketch after this list).

② Interval point selection: place the fewest points so that every interval contains at least one point; sort by right endpoint and put a point at the right endpoint of every interval not yet covered.

③ Interval coverage: cover a target segment with the fewest intervals; sort by left endpoint and, among the intervals that start no later than the current covered position, take the one that reaches farthest to the right.
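
A minimal sketch of ①, selecting the maximum number of disjoint intervals (the function name and the sample intervals are illustrative):

def max_disjoint_intervals(intervals):
    # intervals: list of (left, right); returns the chosen pairwise disjoint intervals.
    # Greedy criterion: sort by right endpoint; take an interval if it starts
    # no earlier than the end of the last interval taken.
    chosen = []
    last_end = float("-inf")
    for left, right in sorted(intervals, key=lambda iv: iv[1]):
        if left >= last_end:
            chosen.append((left, right))
            last_end = right
    return chosen

# Sorting by right endpoint picks (1, 3) and then (4, 7): two disjoint intervals.
print(max_disjoint_intervals([(1, 3), (2, 5), (4, 7), (6, 8)]))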

3. Definition:

A greedy algorithm always makes the choice that looks best at the moment. That is, without considering global optimality, it only makes a locally optimal choice in some sense.

Its core is to select a measurement criterion suited to the problem. The inputs are then arranged in the order required by this criterion and processed one at a time in that order. If an input, combined with the partial solution that is currently optimal under this criterion, cannot produce a feasible solution, it is not added to the partial solution. Greedy algorithms are usually combined with sorting, and choosing the key variable to sort on is the top priority.

There are usually several candidate criteria for a given problem. At first, each of them may seem reasonable, but in fact greedy processing under most of them produces only a suboptimal solution rather than the optimal one. Choosing the criterion that actually yields the optimal solution is therefore the core of designing a greedy algorithm.
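
A small illustration of this point (the data and the function name are my own): the value-to-weight ratio that solves the fractional knapsack is a poor criterion for the 0/1 knapsack, where objects cannot be split.

def greedy_01_knapsack(items, capacity):
    # Greedy by value / weight applied to the 0/1 knapsack (objects are indivisible).
    # This is NOT optimal in general; it only shows the effect of a bad criterion.
    total = 0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if weight <= capacity:
            total += value
            capacity -= weight
    return total

# With capacity 50 the greedy takes (60, 10) and (100, 20) for a value of 160,
# but the optimal 0/1 answer is 220, taking (100, 20) and (120, 30).
print(greedy_01_knapsack([(60, 10), (100, 20), (120, 30)], 50))
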
4. Problem features: the premise of a greedy strategy is that locally optimal choices can produce a globally optimal solution.

(1) As the algorithm proceeds, two other sets are accumulated: one contains the candidate objects that have been considered and selected, the other contains the candidate objects that have been considered and discarded.

(2) There is a function that checks whether a set of candidate objects constitutes a solution to the problem; it does not consider whether that solution is optimal.

(3) Another function checks whether a set of candidate objects is feasible, that is, whether it is still possible to add more candidates to the set and obtain a solution; as with the previous function, optimality is not considered.

(4) The selection function indicates which of the remaining candidate objects is the most promising for building a solution.

Finally, the objective function gives the value of a solution.
To solve the problem, we need to find a set of candidate objects that constitutes a solution and optimizes the objective function; the greedy algorithm builds it step by step. Initially, the set of selected candidates is empty. At each subsequent step, the selection function chooses, from the remaining candidates, the object most likely to lead to a solution. If adding that object would make the set infeasible, the object is discarded and never considered again; otherwise it is added to the set. Each time the set is expanded, it is checked whether it now constitutes a solution. If the greedy algorithm works correctly, the first solution found is usually optimal.
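
A schematic of this framework (the names candidates, is_solution, is_feasible, and select are placeholders for the functions described above, not part of any specific library):

def greedy(candidates, is_solution, is_feasible, select):
    # candidates  : the set of candidate objects
    # is_solution : does this set of chosen candidates solve the problem?
    # is_feasible : can this set still be extended to a solution?
    # select      : which remaining candidate looks most promising?
    remaining = set(candidates)
    chosen = set()
    while remaining and not is_solution(chosen):
        best = select(remaining, chosen)        # most promising remaining candidate
        remaining.remove(best)
        if is_feasible(chosen | {best}):        # keep it only if the set stays feasible
            chosen.add(best)
    return chosen if is_solution(chosen) else None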

5. Important properties: ① Greedy choice property:
The greedy choice property means that a globally optimal solution to the problem can be reached through a sequence of locally optimal (greedy) choices. In other words, when making a choice, we only consider what is best for the current problem, without looking at the results of subproblems. This is the first basic requirement for a greedy algorithm to be applicable. For a specific problem, to determine whether it has the greedy choice property, one must prove that the greedy choice made at each step ultimately leads to a globally optimal solution.
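
A classic illustration (the denominations are my own example): with the coin system {25, 10, 5, 1}, always taking the largest coin that fits is provably optimal, but with {25, 10, 1} the same greedy choice is not, which is exactly why the property must be proven for each specific problem.

def greedy_coin_change(denoms, amount):
    # Always take as many of the largest remaining coin as possible;
    # returns the number of coins used.
    coins = 0
    for d in sorted(denoms, reverse=True):
        coins += amount // d
        amount %= d
    return coins

print(greedy_coin_change([25, 10, 5, 1], 30))   # 2 coins (25 + 5): optimal
print(greedy_coin_change([25, 10, 1], 30))      # 6 coins (25 + five 1s), but 10 + 10 + 10 needs only 3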

② Optimal substructure property:
When an optimal solution to a problem contains optimal solutions to its subproblems, the problem is said to have optimal substructure. Optimal substructure is the key feature that allows a problem to be solved by a greedy algorithm.
6. Huffman coding: ① Based on the probability of occurrence of each character, this method constructs a prefix code (no codeword is a prefix of another) with the shortest average codeword length. It is sometimes called an optimal code and is generally known as Huffman coding.
② Place each character into the tree set as a single-node tree whose weight equals the frequency of the corresponding character. Repeatedly merge the two trees with the minimum weights into a new tree and put it back into the set; the weight of the new tree equals the sum of the weights of the two subtrees. Repeat until only one tree remains.
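
A minimal sketch of this merging process (the frequencies and the function name are illustrative), using a min-heap to repeatedly merge the two lowest-weight trees:

import heapq

def huffman_codes(freq):
    # freq: dict mapping character -> frequency; returns dict character -> codeword.
    # Each heap entry is (weight, tie_breaker, {char: code_so_far}); merging two
    # trees prepends '0' to the codes in one and '1' to the codes in the other,
    # and the weight of the merged tree is the sum of the two weights.
    heap = [(w, i, {ch: ""}) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)       # the two minimum-weight trees
        w2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2] if heap else {}

# More frequent characters receive shorter codewords.
print(huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))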
