DP Getting Started--An Introduction to Dynamic Programming with the Rod-Cutting Problem

Brief introduction

If you spend much time on LeetCode, you will notice that many problems carry the dynamic programming tag. In fact, there are 115 problems tagged DP, most of them medium or hard, accounting for 12.8% of all problems (as of September 2018), which makes DP the second-largest tag.

If we study DP systematically, I believe it will greatly improve our problem-solving speed and also help with the practical problems we meet later.

This article starts from the rod-cutting problem and introduces the motivation, core idea, and two common implementation methods of dynamic programming.

The rod-cutting problem

The rod-cutting problem (sometimes translated as the "split bar" problem) is a classic example of a dynamic programming problem.

Given a rod of integer length n, the rod may be cut into pieces of any lengths (including not cutting it at all), as long as every piece has integer length. Given an integer array p[], where p[i] is the price of a rod of length i, how should the rod be cut to obtain the maximum revenue?

An example p[] array (the standard price table from CLRS, section 15.1, which this article references) is as follows:
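
length i   :  1   2   3   4   5   6   7   8   9   10
price p[i] :  1   5   8   9  10  17  17  20  24   30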

Approach

A rod of length n can be cut at integer positions in 2^(n-1) different ways, because there are n-1 candidate positions and each one can either be cut or not.

Number the candidate positions 1, 2, ..., n-1. If we cut at position 1, the rod becomes two pieces of length 1 and n-1; if we cut at position 2, it becomes pieces of length 2 and n-2, and so on. Including the option of not cutting at all, there are n choices in total. This gives the first step of a recursion: the optimal cutting of a rod of length n is split into two subproblems, the optimal cutting of a rod of length i and the optimal cutting of a rod of length n-i (i = 1, ..., n). The final revenue is the sum of the revenues of the two sub-rods.

Let f(n) denote the maximum revenue obtainable from a rod of length n. From the analysis above, f(n) is the maximum over the sums of the revenues of the two sub-rods. That is,

f(n) = max(p(n), f(1) + f(n-1), f(2) + f(n-2), ..., f(n-1) + f(1)).

This idea is correct, but not ideal. A careful reader will notice that the subproblems overlap heavily. For example, computing f(n-1) requires looking at f(1) + f(n-2), so the subproblem f(1) + f(n-1) ends up examining the split f(1) + f(1) + f(n-2); but computing f(2) also requires looking at f(1) + f(1), so the subproblem f(2) + f(n-2) examines the very same split f(1) + f(1) + f(n-2). In other words, some possibilities are examined multiple times.

A more concise and reasonable idea is this: fix the leftmost piece at length i and never cut it further; only the remaining right part of length n-i may be cut. The problem becomes

f(n) = max(p(i) + f(n-i)),   i = 1, ..., n
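
For example, with the price table above (and taking p(0) = f(0) = 0):

f(1) = p(1) + f(0) = 1
f(2) = max(p(1) + f(1), p(2) + f(0)) = max(2, 5) = 5
f(3) = max(p(1) + f(2), p(2) + f(1), p(3) + f(0)) = max(6, 6, 8) = 8
f(4) = max(p(1) + f(3), p(2) + f(2), p(3) + f(1), p(4) + f(0)) = max(9, 10, 9, 9) = 10

so the best cut of a rod of length 4 is two pieces of length 2, giving revenue 10.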

Traditional recursive implementation

Based on the analysis above, a plain recursive implementation looks like this:

int cutRod(int n, int[] p) {
    // assumes p has length at least n + 1, with p[i] = price of a rod of length i (p[0] unused)
    if (n == 0)
        return 0;
    int max = Integer.MIN_VALUE;
    for (int i = 1; i <= n; i++)                       // try every length i for the uncut left piece
        max = Math.max(max, p[i] + cutRod(n - i, p));  // price of the left piece plus the best cut of the rest
    return max;
}
Complexity of the traditional recursive implementation

Let T(n) be the number of operations performed by the algorithm at the node for a rod of length n, including its recursive calls. Then

T(n) = 1 + Σ T(i),   i = 0, 1, ..., n-1

(the 1 is the constant cost of the addition and max operations at this node)

This recurrence is easy to evaluate by repeatedly substituting the last term, from back to front:

T(n) = 1 + T(0) + T(1) + ... + T(n-1)
     = 1 + T(0) + T(1) + ... + T(n-2) + (1 + T(0) + T(1) + ... + T(n-2))
     = 2 (1 + T(0) + T(1) + ... + T(n-2))
     = 2 (1 + T(0) + T(1) + ... + T(n-3) + (1 + T(0) + T(1) + ... + T(n-3)))
     = 4 (1 + T(0) + T(1) + ... + T(n-3))
     = ...   (repeating the same substitution)
     = 2^(n-1) (1 + T(0))
     = 2^n   (since T(0) = 1)
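
As a quick check, the recurrence indeed gives T(1) = 2, T(2) = 4, T(3) = 8, consistent with T(n) = 2^n.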

That is, the time complexity of the traditional recursive algorithm is O(2^n), i.e., exponential.

Recall from the previous section that there are 2^(n-1) possible ways of cutting, which means the recursive algorithm walks through every single one of them. Is there room for optimization?

Optimization

Take n = 4 as an example and draw the recursion tree (each node is labeled with its value of n):

This figure is from Introduction to Algorithms (English version), 3rd ed., section 15.1, p. 346.

You can see that the subtrees overlap. Most obviously, the standalone subtree for n = 2 and the n = 2 subtree inside the n = 3 subtree are exactly the same, so the same computation is done twice. And that subtree in turn contains the subtree for n = 1, which means the wasted work multiplies.

One optimization is to record the result of each subproblem and, the next time the same subproblem appears, use the recorded value directly. This is the core idea of dynamic programming.

Dynamic programming

As described in the previous section, dynamic programming is a "trade space for time" idea that applies to optimization problems whose subproblems overlap. Its basic approach is to record the answers of subproblems that have already been computed, so that each subproblem is computed only once.

Dynamic programming can be implemented in two ways, top-down and bottom-up. The former can be understood as starting the recursive calls from the root of the recursion tree; the latter iterates upward from the leaves of the tree.

Top-down with Memoization

The top-down method is easy to understand: it is the traditional recursion plus memoization (mind the spelling: memoization comes from "memo"), i.e., the results are cached in an array, table, or similar structure. On every recursive call, we first check whether the desired result is already in the cache; if it is not, we compute it and store it in the cache.

int cutRod(int n, int[] p) {
    int[] memo = new int[n + 1];
    for (int i = 0; i < memo.length; i++)
        memo[i] = Integer.MIN_VALUE;    // initialization: MIN_VALUE means "not computed yet"
    return cutRod(n, p, memo);
}

int cutRod(int n, int[] p, int[] memo) {
    if (memo[n] != Integer.MIN_VALUE)
        return memo[n];                 // return the memoized value directly
    if (n == 0)
        return 0;
    int max = Integer.MIN_VALUE;
    for (int i = 1; i <= n; i++)
        max = Math.max(max, p[i] + cutRod(n - i, p, memo));
    memo[n] = max;                      // memoize it
    return max;
}
Bottom-up with Tabulation

In contrast to top-down, bottom-up solves the smallest subproblems first, using loops instead of recursion, and then solves each parent problem from the answers of its subproblems. Tabulation is equally easy to understand: a table stores the answers to the subproblems; to solve a parent problem we look up everything it needs in the table, write its own answer back into the table, and continue until the table is full.

In fact, this is where the puzzling name of dynamic programming comes from. In mathematics, "programming" has the sense of a tabular method: to find the maximum/minimum of a function, you list the possible values of its variables in a table and perform certain operations on the table to obtain the result. There the table is "static": the information in each cell is independent. In dynamic programming the table is "dynamic": the information in some cells depends on the answers computed in other cells. Dynamic programming can therefore be understood as a "dynamic tabular method", which is exactly the tabulation described here.

The bottom-up implementation is as follows:

int cutRod(int n, int[] p) {
    int[] table = new int[n + 1];
    for (int j = 1; j <= n; j++) {      // fill the table from j = 1 to n
        int max = Integer.MIN_VALUE;
        for (int i = 1; i <= j; i++)
            max = Math.max(max, p[i] + table[j - i]);   // compute f(j) from smaller entries
        table[j] = max;
    }
    return table[n];
}
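
As a quick sanity check, here is a minimal usage sketch that runs the bottom-up cutRod on the CLRS example prices (the class name RodCuttingDemo is just for illustration); note that the price array is 1-indexed, so index 0 holds an unused placeholder:

public class RodCuttingDemo {
    public static void main(String[] args) {
        // prices from the CLRS example table above; p[0] is an unused placeholder
        int[] p = {0, 1, 5, 8, 9, 10, 17, 17, 20, 24, 30};
        System.out.println(cutRod(4, p));   // prints 10 (two pieces of length 2)
        System.out.println(cutRod(10, p));  // prints 30 (no cut at all)
    }

    // the bottom-up implementation from above, unchanged
    static int cutRod(int n, int[] p) {
        int[] table = new int[n + 1];
        for (int j = 1; j <= n; j++) {
            int max = Integer.MIN_VALUE;
            for (int i = 1; i <= j; i++)
                max = Math.max(max, p[i] + table[j - i]);
            table[j] = max;
        }
        return table[n];
    }
}
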
Complexity analysis

In the bottom-up solution, we fill the table from 1 to n. When computing table[j], we need to look at table[j-1] down to table[0], i.e., j lookups. Filling the whole table therefore takes 1 + 2 + 3 + ... + n = n(n+1)/2 = O(n^2) lookups, so the time complexity of the bottom-up solution is O(n^2).

The complexity of the top-down solution can be analyzed as follows. Because of the caching, each subproblem is computed only once. Solving a problem of size n requires the answers to the subproblems of sizes 0, 1, 2, ..., n-1 (the recursive calls), and combining them for size n takes n iterations of the for loop, so the complexity of the top-down solution is also O(n^2).

In effect, dynamic programming simplifies the recursion tree shown earlier by merging the overlapping subtrees into a subproblem graph. The subproblem graph has far fewer edges and nodes, which is exactly where the improvement in time complexity comes from.

-

Concepts

Having worked through the example, let us summarize the concepts behind the dynamic programming algorithm.

Dynamic programming
    • Dynamic programming is a class of algorithms commonly used to solve optimization problems. Its greatest feature is that it uses information from subproblems to help solve the parent problem, which lowers the difficulty of the problem. For example, computing the 1,000,002nd Fibonacci number looks complicated, but if you already know the 1,000,000th and 1,000,001st Fibonacci numbers, things become much simpler (see the sketch after this list).
    • A notable feature of dynamic programming problems is that their subproblems overlap. By recording the results of subproblems, dynamic programming guarantees that each subproblem is computed only once, avoiding wasted time. The time complexity of dynamic programming is usually polynomial, i.e., O(n^k) with k a non-negative constant, whereas an algorithm that does not record results usually repeats computations and ends up with a much higher complexity, often exponential.
    • The English term "dynamic programming" is a little hard to understand. The "programming" here does not refer to writing code; it refers to a mathematical method for optimization problems called the tabular method, whose rough procedure is to list the possible values of a function's variables in a table and perform various operations on the table to obtain the result. Where the tabular method is static, in a dynamic programming algorithm the table grows gradually: the relatively simple subproblems are solved first, and the parent problems are then obtained by combining the subproblem results, so the table appears "dynamic". This is the meaning of dynamic programming.
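
To make the Fibonacci remark above concrete, here is a minimal bottom-up sketch (illustrative only; the method name fib is our own): each table entry depends only on the two entries before it, so every subproblem is computed exactly once.

// bottom-up (tabulation) Fibonacci: table[i] holds fib(i)
static long fib(int n) {
    if (n <= 1) return n;
    long[] table = new long[n + 1];
    table[0] = 0;
    table[1] = 1;
    for (int i = 2; i <= n; i++)
        table[i] = table[i - 1] + table[i - 2];   // parent entry built from two already-filled entries
    return table[n];   // note: long overflows for n > 92; huge indices like 1,000,002 need BigInteger
}
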
Memoization vs. Tabulation
    • Dynamic programming usually comes in two flavors: top-down and bottom-up.
    • Top-down usually appears in recursive form, starting from the parent problem and solving the subproblems recursively. The top-down approach is usually combined with memoization: computed results are cached in an array or hash table. When a recursive call starts, it first checks whether the result is already in the cache and, if so, returns the cached result directly.
    • Bottom-up usually appears in the form of loops. The bottom-up approach is usually combined with tabulation: the smallest subproblems are solved first and their results are recorded in a table (usually a one- or two-dimensional array); a parent problem is solved by looking up the results of its subproblems directly in the table, and its own result is then written back into the table, until the table is full. The final entry is the answer to the original problem.
Resources

Dynamic Programming (Wikipedia)

Introduction to Algorithms (English version), 3rd ed., section 15.1

What is dynamic programming? (Stack Overflow)

Tabular Method of Minimisation

The logic of digital images (tabular method)

Dynamic programming and memoization: bottom-up vs. top-down approaches

DP Series, next: DP Methodology--Learning to solve DP problems through matrix multiplication
