Dynamic Programming Introduction


Optimal substructure

In computer science, a problem is said to have optimal substructure if an optimal solution can be constructed efficiently from optimal solutions of its subproblems. This property is used to determine the usefulness of dynamic programming and greedy algorithms for a problem.

Typically, a greedy algorithm is used to solve a problem with optimal substructure if it can be proved by induction that the greedy choice is optimal at each step. Otherwise, provided the problem exhibits overlapping subproblems as well, dynamic programming is used. If there is no appropriate greedy algorithm and the problem fails to exhibit overlapping subproblems, often a lengthy but straightforward search of the solution space is the best alternative.

Overlapping subproblems

In computer science, a problem is said to have overlapping subproblems if the problem can be broken down into subproblems which are reused several times, or if a recursive algorithm for the problem solves the same subproblem over and over rather than always generating new subproblems.

For example, the problem of computing the Fibonacci sequence exhibits overlapping subproblems. The problem of computing the nth Fibonacci number F(n) can be broken down into the subproblems of computing F(n − 1) and F(n − 2), and then adding them. The subproblem of computing F(n − 1) can itself be broken down into a subproblem that involves computing F(n − 2). Therefore the computation of F(n − 2) is reused, and the Fibonacci sequence thus exhibits overlapping subproblems.

A naive recursive approach to such a problem generally fails due to its exponential complexity. If the problem also has the optimal substructure property, dynamic programming is a good way to solve it.
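The Fibonacci example above can be sketched in Python. This is a minimal illustration, not a canonical implementation: the naive version recomputes the same subproblems exponentially often, while the memoized version stores each result so it is computed only once.

```python
def fib_naive(n):
    # Exponential time: fib_naive(n - 2) is recomputed
    # both here and inside the call to fib_naive(n - 1).
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_memo(n, cache=None):
    # Linear time: each F(k) is computed once, stored in
    # the cache, and simply looked up on later calls.
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]

print(fib_naive(10))  # 55
print(fib_memo(50))   # 12586269025 -- infeasible for the naive version
```

Both functions compute the same values; only the number of recursive calls differs, which is exactly the overlap that dynamic programming exploits.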

Dynamic Programming

Not to be confused with dynamic programming language.

In mathematics, computer science, economics, and bioinformatics, dynamic programming is a method for solving complex problems by breaking them down into simpler subproblems. It is applicable to problems exhibiting the properties of overlapping subproblems and optimal substructure. When applicable, the method takes far less time than naive methods that don't take advantage of the subproblem overlap (like depth-first search).

The idea behind dynamic programming is quite simple. In general, to solve a given problem, we need to solve different parts of the problem (subproblems), then combine the solutions of the subproblems to reach an overall solution. Often when using a more naive method, many of the subproblems are generated and solved many times. The dynamic programming approach seeks to solve each subproblem only once, thus reducing the number of computations: once the solution to a given subproblem has been computed, it is stored or "memoized", and the next time the same solution is needed, it is simply looked up. This approach is especially useful when the number of repeating subproblems grows exponentially as a function of the size of the input.
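The "solve each subproblem only once" idea can also be applied bottom-up, without recursion: fill a table of subproblem solutions in order, so every entry is computed exactly once and later entries only look earlier ones up. A sketch, again using Fibonacci:

```python
def fib_table(n):
    # Bottom-up dynamic programming: table[k] holds F(k),
    # and each entry is computed exactly once from the two
    # previously stored entries.
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for k in range(2, n + 1):
        table[k] = table[k - 1] + table[k - 2]
    return table[n]

print(fib_table(50))  # 12586269025
```

The bottom-up form trades the implicit call stack of memoized recursion for an explicit table, which often makes the time and space costs easier to see.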

Dynamic programming algorithms are used for optimization (for example, finding the shortest path between two points, or the fastest way to multiply many matrices). A dynamic programming algorithm will examine the previously solved subproblems and will combine their solutions to give the best solution for the given problem. The alternatives are many, such as using a greedy algorithm, which picks the locally optimal choice at each branch in the road. The locally optimal choice may be a poor choice for the overall solution. While a greedy algorithm does not guarantee an optimal solution, it is often faster to calculate. Fortunately, some greedy algorithms (such as minimum spanning trees) are proven to lead to the optimal solution.
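To make the shortest-path use case concrete, here is a hedged sketch of shortest paths in a directed acyclic graph: vertices are processed in topological order, and each distance is built by combining the previously solved subproblems (the final distances to earlier vertices). The graph, vertex names, and weights below are made-up illustrative data.

```python
def dag_shortest_paths(order, edges, source):
    # order: the vertices listed in topological order.
    # edges: mapping u -> list of (v, weight) pairs.
    # Returns the shortest distance from source to every vertex.
    INF = float("inf")
    dist = {v: INF for v in order}
    dist[source] = 0
    for u in order:
        if dist[u] == INF:
            continue  # u is unreachable from source
        for v, w in edges.get(u, []):
            # Relax edge (u, v): dist[u] is already final,
            # so this subproblem solution can be reused safely.
            dist[v] = min(dist[v], dist[u] + w)
    return dist

# Hypothetical example graph.
order = ["A", "B", "C", "D"]
edges = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 6)],
    "C": [("D", 3)],
}
print(dag_shortest_paths(order, edges, "A"))
# {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

Note how the optimal distance to D is found not by greedily taking the direct B→D road (total 7) but by combining the optimal subproblem solutions through C (total 6).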

For example, let's say that you have to get from point A to point B as fast as possible, in a given city, during rush hour. A dynamic programming algorithm would look at finding the shortest paths to points close to A, and use those solutions to eventually find the shortest path to B. On the other hand, a greedy algorithm would start you driving immediately and would pick the road that looks fastest at every intersection. As you can imagine, this strategy might not lead to the fastest arrival time, since you might take some 'easy' streets and then find yourself hopelessly stuck in a traffic jam.

Sometimes, applying memoization to a naive basic recursive solution already results in an optimal dynamic programming solution; however, many problems require more sophisticated dynamic programming algorithms. Some of these are recursive as well but parametrized differently from the naive solution. Others can be more complicated and cannot be implemented as a recursive function with memoization. Examples of these are the solutions to the egg dropping puzzle below.

