Greedy algorithms are typically used for optimization problems: we make a sequence of choices, each of which is locally optimal. For many problems this yields a globally optimal solution, and it is usually much faster than dynamic programming.
16.1 The Activity-Selection Problem
Sort the activities by finish time, then repeatedly select the earliest-finishing activity that is compatible with those already chosen.
Theorem 16.1: Consider any non-empty subproblem S_k, and let a_m be an activity in S_k with the earliest finish time. Then a_m is included in some maximum-size subset of mutually compatible activities of S_k.
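The greedy strategy above can be sketched as follows. This is a minimal illustration, not the textbook's pseudocode; the function and variable names are my own, and the sample data follows the style of the textbook's example instance.

```python
def select_activities(activities):
    """Greedily pick a maximum-size set of mutually compatible activities.

    activities: list of (start, finish) pairs.
    """
    chosen = []
    last_finish = float("-inf")
    # Sort by finish time, so the first compatible activity we meet always
    # has the earliest finish time (the greedy choice Theorem 16.1 justifies).
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:  # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
        (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
print(select_activities(acts))  # -> [(1, 4), (5, 7), (8, 11), (12, 16)]
```

Sorting dominates the running time, so the whole algorithm runs in O(n log n).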
16.2 Elements of the Greedy Strategy
Steps for designing a greedy algorithm:
1. Cast the optimization problem as one in which we make a choice and are left with only one subproblem to solve.
2. Prove that there is always an optimal solution to the original problem that makes the greedy choice, so the greedy choice is always safe.
3. Show that, having made the greedy choice, the remaining subproblem has the property that combining its optimal solution with the greedy choice yields an optimal solution to the original problem. This establishes optimal substructure.
Greedy-choice property:
We can assemble a globally optimal solution by making locally optimal (greedy) choices.
Dynamic programming: each choice depends on the solutions to subproblems; whether computed bottom-up or top-down, the subproblems must be solved first.
Greedy algorithm: each choice may depend on choices made so far, but not on future choices or on the solutions to subproblems. The first choice is made before any subproblem is solved, so a greedy algorithm proceeds top-down.
If making the greedy choice requires examining many candidates, it usually means the choice can be refined to be more efficient. We can often make the greedy choice faster by preprocessing the input or by using an appropriate data structure (often a priority queue).
Optimal substructure:
A problem exhibits optimal substructure if an optimal solution to the problem contains optimal solutions to its subproblems. This property is a key ingredient for applying both dynamic programming and greedy algorithms.
Greedy versus dynamic programming:
the 0-1 knapsack problem (dynamic programming) versus the fractional knapsack problem (greedy algorithm).
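The fractional variant admits a simple greedy solution: take items in decreasing order of value per unit weight, splitting the last item if necessary. A minimal sketch (function and variable names are my own; the instance is the classic 60/100/120 example):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Returns the maximum total value."""
    total = 0.0
    # Greedy choice: highest value-to-weight ratio first.
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)    # take the whole item, or a fraction of it
        total += value * take / weight
        capacity -= take
    return total

# values 60, 100, 120; weights 10, 20, 30; capacity 50
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # -> 240.0
```

The same ratio-first strategy fails for the 0-1 version (here it would pick items worth 60 and 100 for a total of 160, while taking the 100 and 120 items yields 220), which is why that variant needs dynamic programming.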
16.3 Huffman Codes
Prefix code: no codeword is a prefix of any other codeword. A prefix code can always achieve the optimal data compression ratio.
An optimal coding scheme for a file always corresponds to a full binary tree, i.e., a tree in which every non-leaf node has exactly two children. (Note that Chinese and Western textbooks define "full binary tree" differently; the Chinese definition additionally requires every level to be completely filled.)
If C is the alphabet and every character in C appears with positive frequency, then the tree for an optimal prefix code has exactly |C| leaf nodes, one for each character of the alphabet, and exactly |C| - 1 internal nodes.
Lemma 16.2: Let C be an alphabet in which each character c ∈ C has frequency c.freq, and let x and y be two characters in C with the lowest frequencies. Then there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit.
Lemma 16.3: Let C be an alphabet in which each character c ∈ C has frequency c.freq, and let x and y be two characters in C with the lowest frequencies. Let C' be the alphabet obtained from C by removing x and y and adding a new character z, that is, C' = C - {x, y} ∪ {z}. Define freq for C' as for C, except that z.freq = x.freq + y.freq. Let T' be a tree representing an optimal prefix code for C'. Then the tree T, obtained from T' by replacing the leaf node z with an internal node having x and y as children, represents an optimal prefix code for C.
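The two lemmas justify Huffman's greedy step: repeatedly merge the two lowest-frequency nodes, using a priority queue as suggested in 16.2. A minimal sketch (names are my own; the frequencies are the a-f example instance commonly used for this algorithm):

```python
import heapq

def huffman_codes(freq):
    """freq: dict mapping character -> frequency. Returns char -> codeword."""
    # Heap entries are (frequency, tie-breaker, subtree); a subtree is either
    # a character (leaf) or a pair of subtrees (internal node).
    heap = [(f, i, ch) for i, (ch, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # lowest frequency (Lemma 16.2)
        f2, _, right = heapq.heappop(heap)  # second lowest
        # Merge into one node z with z.freq = x.freq + y.freq (Lemma 16.3).
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):         # internal node: recurse left/right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                               # leaf: a character
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

codes = huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
print(sorted(len(codes[ch]) for ch in "abcdef"))  # -> [1, 3, 3, 3, 4, 4]
```

With a binary heap, each of the |C| - 1 merges costs O(log |C|), so building the tree takes O(|C| log |C|) time.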
16.4 Matroids and Greedy Methods
16.5 A Task-Scheduling Problem as a Matroid
Notes on Introduction to Algorithms, Chapter 16: Greedy Algorithms