Greedy Method _2 (draft)

Source: Internet
Author: User


51Nod Greedy Starter Tutorial _ Task Execution Order

There are n tasks to execute. Running the i-th task temporarily requires r[i] units of space; after it finishes, part of that space is released, and storing its final result permanently occupies o[i] units of space (o[i] < r[i]).

For example:
A task with r[i] = 5 and o[i] = 2 needs 5 free units while it runs; once it finishes, 3 units are released and 2 units remain occupied by its result.
Given the execution and storage space of the n tasks, find the minimum amount of space needed to execute all of them.


Analysis:
The problem can be abstracted: start from some integer; each operation first subtracts some value a and then adds back some value b (both a and b are positive), and no intermediate result may be negative. We want the smallest starting integer for which, in some order of the operations, this is possible.

For this problem, a[i] = r[i] and b[i] = r[i] - o[i]; since 0 < o[i] < r[i], we have 0 < b[i] < a[i].
So although some space is returned after every task, less is added back than was subtracted, and the free space shrinks overall.
The point is that we must find the "best" order of execution.
You can try a variety of greedy strategies.

We give the standard answer: executing the tasks in non-increasing order of b[i] is the most "advantageous" order.
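
For intuition, take two tasks from the sample input below: (r, o) = (14, 1), so b = 13, and (r, o) = (2, 1), so b = 1. Running the b = 13 task first needs a peak of 14 units (afterwards 1 unit stays occupied, and the second task then needs only 1 + 2 = 3); running the b = 1 task first needs a peak of 1 + 14 = 15. Sorting by b in non-increasing order gives the smaller answer, 14.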

To make "advantageous" precise, we now prove the following conclusion:

Suppose b[0] >= b[1] >= ... >= b[x] < b[x + 1].
If executing (a[0], b[0]), ..., (a[x], b[x]), (a[x + 1], b[x + 1]) in this order never produces a negative value, then after exchanging the pairs (a[x], b[x]) and (a[x + 1], b[x + 1]) it still never produces a negative value.

Proof:
Exchanging (a[x], b[x]) and (a[x + 1], b[x + 1]) can only help task x + 1: each pair (a[i], b[i]) is in effect a net subtraction (since b[i] < a[i]), so the earlier a pair is placed, the more has already been subtracted when a later task runs; moving task x + 1 forward therefore cannot make its own step go negative.
The key is to check that moving (a[x], b[x]) to the back does not produce a negative value either.

That means checking whether (previous value) - a[x + 1] + b[x + 1] - a[x] can be negative.
(Note that (previous value) - a[x + 1] is not negative, since, as just argued, running task x + 1 earlier is only more favorable.)

We also know that (previous value) - a[x] + b[x] - a[x + 1] is not negative, because that is exactly our assumption about the original order.
Since b[x + 1] > b[x], the former quantity is larger than the latter, so (previous value) - a[x + 1] + b[x + 1] - a[x] is not negative.

Therefore the exchange never produces a negative value: if the original order produces no negative values, the order after the exchange produces none either.
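
As a quick numeric check (values chosen only for illustration): let the value before the two tasks be 10, (a[x], b[x]) = (5, 1) and (a[x + 1], b[x + 1]) = (6, 4), so b[x] < b[x + 1]. The original order gives 10 -> 5 -> 6 -> 0 -> 4, never negative; the exchanged order gives 10 -> 4 -> 8 -> 3 -> 4, also never negative, and the final value is the same.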

By finitely many such adjacent exchanges, any feasible order can be turned into the order sorted by b in non-increasing fashion while remaining feasible throughout.
Thus any feasible scheme is no better than executing the tasks in non-increasing order of b, which proves that our greedy strategy is correct.

It is a somewhat surprising strategy: we only look at b, yet we obtain the optimal order.
This shows that greedy algorithms take some intuition: make a bold hypothesis, then verify it carefully.
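
Concretely, once the tasks are executed in non-increasing order of b[i], the answer is the peak requirement: the maximum over i of ( r[i] + the sum of o[j] over all tasks j executed before i ), i.e. the results already stored plus the space the current task needs to run. This is exactly what the single scan in the program below computes.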


Input
Line 1: a single number N, the number of tasks. (2 <= N <= 100000)
Lines 2 to N + 1: two numbers per line, r[i] and o[i], the space needed to execute the task and to store its result. (1 <= o[i] < r[i] <= 10000)

Output
Output the minimum amount of space needed to execute all the tasks.

Input example
20
14 1
2 1
11 3
20 4
9 {
6 5
20 7
19 8
9 4
20 10
18 11
12 6
13 12
14 9
15 2
16 15
17 15
19 13
20 2
20 1

Output example
135


#include <iostream>
#include <cstdio>
#include <cstring>
#include <cstdlib>
#include <algorithm>
#include <queue>
#include <vector>
#include <stack>
using namespace std;

typedef long long ll;
typedef unsigned long long ull;
typedef unsigned int uint;

const int INF = 0x3f3f3f3f;
const int MAXN = 1e5 + 10;

struct Node {
    int r, o, b;        // execution space, storage space, b = r - o
};

Node node[MAXN];

int cmp_num(const void* a, const void* b);

int main()
{
#ifdef __air_h
    freopen("in.txt", "r", stdin);
#endif // __air_h
    int n;
    scanf("%d", &n);
    for (int i = 0; i < n; ++i) {
        scanf("%d%d", &node[i].r, &node[i].o);
        node[i].b = node[i].r - node[i].o;
    }
    int max_space = 0;
    // sort the tasks by b in non-increasing order (the greedy order)
    qsort(node, n, sizeof(node[0]), cmp_num);
    int now_v = 0;                      // space currently occupied by stored results
    for (int i = 0; i < n; ++i) {
        now_v += node[i].r;             // reserve space to run task i
        if (now_v > max_space) {
            max_space = now_v;          // record the peak requirement
        }
        now_v -= node[i].b;             // release r[i] - o[i] when the task ends
    }
    printf("%d\n", max_space);
    return 0;
}

// non-increasing order of b
int cmp_num(const void* a, const void* b)
{
    Node* x = (Node*)a;
    Node* y = (Node*)b;
    return (y->b - x->b) > 0 ? 1 : -1;
}


51Nod Greedy Classic Algorithm _ Greedy algorithm overview


A greedy algorithm (also called a greedy method) always makes the choice that looks best at the moment when solving a problem.

In other words, it does not consider global optimality; what it produces is, in some sense, a locally optimal choice.

A greedy algorithm does not yield the globally optimal solution for every problem. The key lies in the choice of greedy strategy, and the chosen strategy must have no after-effect:

that is, the process carried out before reaching a state does not affect what happens afterwards; only the current state matters.


A greedy algorithm relies on an optimal-substructure property. It is characteristically "short-sighted": at every step it takes the decision most advantageous for the current situation, hoping to reach the optimal solution step by step.

Personally, I think greediness is not one specific method but a class of methods; the hard part of a greedy algorithm is not coming up with it but proving its correctness.

To prove that a greedy algorithm is correct, we need to show that an optimal solution can be gradually transformed into the solution produced by the greedy algorithm without ever becoming worse.

This proves that the greedy solution is as good as the optimal one (obviously it cannot be better).

To prove that a greedy algorithm is wrong, it is enough to find a single counter-example.

Usually, neither proving a greedy algorithm correct nor finding a counter-example for it is easy.

Moreover, even for the same problem, greedy algorithms designed from different angles may differ in correctness.

For example, Dijkstra's algorithm is a famous greedy algorithm for the single-source shortest path problem on a graph.

Suppose we instead propose another greedy algorithm: starting from the source, always continue along the shortest available edge, until we reach the destination, have visited all the vertices, or have nowhere left to go.

According to this algorithm, the shortest path from a to c is a-b-c with length 5, while a-d-c is obviously the true shortest path.
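
The original figure for this counter-example is not reproduced here, but one graph with exactly this behavior (the weights are an illustrative assumption) is: w(a,b) = 2, w(b,c) = 3, w(a,d) = 3, w(d,c) = 1. The shortest-edge greedy leaves a along a-b (cost 2 < 3) and then continues along b-c, reporting a-b-c with length 5, while a-d-c costs only 4.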

So this greedy algorithm is wrong.

Therefore, for a given problem we only say that a particular greedy algorithm is wrong; we do not say the problem cannot be solved greedily, because a greedy algorithm designed from another angle may be correct.


Even when a problem cannot be solved exactly by a greedy algorithm, a greedy algorithm can still produce a "plausible" approximate solution, which is one of the practical reasons greedy algorithms matter.

Well-known greedy algorithms include:

Dijkstra's single-source shortest path algorithm, Prim's and Kruskal's minimum spanning tree algorithms, Huffman coding (a simple compression scheme), and so on.

If the greedy framework is described abstractly, I think it can be put like this:

Suppose we have a set S of objects; each object x has a profit payoff(x), and for any subset T of S there is a function isValid(T) that returns a Boolean value telling whether T is legal.

This function usually has the following properties: the empty set is legal;

if T is legal, any subset of T is legal;

if T is illegal, any superset of T is illegal.

Our goal is to select some objects from S to form a set V such that isValid(V) == true and payoff(V) is as large as possible,

where payoff(V) is defined as the sum of the profits of all the objects in V.

The greedy algorithm for this problem starts from the empty set and repeatedly picks a valid object x with maximum payoff to add to V: V = V ∪ {x}.
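
A minimal sketch of this abstract loop is given below; the container type and the payoff/isValid callbacks are illustrative assumptions, not part of any 51Nod problem:

#include <algorithm>
#include <functional>
#include <vector>

// Greedy selection sketch: consider objects by non-increasing payoff and keep
// each one only if the chosen set stays valid.
std::vector<int> greedy_select(std::vector<int> objects,
                               const std::function<double(int)>& payoff,
                               const std::function<bool(const std::vector<int>&)>& is_valid)
{
    std::sort(objects.begin(), objects.end(),
              [&](int a, int b) { return payoff(a) > payoff(b); });

    std::vector<int> chosen;                 // V starts as the empty set
    for (int x : objects) {
        chosen.push_back(x);                 // tentatively add x
        if (!is_valid(chosen)) {
            chosen.pop_back();               // x would make V illegal, so discard it
        }
    }
    return chosen;
}

Whether this loop actually returns a payoff-maximal valid set depends on the structure of isValid (the hereditary properties above are necessary but not always sufficient), which is exactly why the correctness proof matters.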


As you can see, problems with these properties are actually quite special; the properties above essentially describe the greedy-choice property.

Greedy choice is relatively "short-sighted": it selects one optimal element, and even if several are tied, it picks just one.

A dynamic programming algorithm, in contrast, chooses among all states and all decisions that can lead to the current state.

So, in a sense, dynamic programming is enumeration, just a clever one: it enumerates all states and the decisions available in each state.

Greedy, however, makes a single choice each time, blindly taking the decision that looks best at the moment.

A comparison of greedy algorithms and dynamic programming is given in the following table:


51Nod Greedy Classic Algorithm _ Prim Algorithm

Prim's algorithm for the minimum spanning tree is another classic application of the greedy method.

Prim's algorithm is characterized by maintaining a single tree at all times: it keeps adding edges, and the selected edges form a tree throughout the process.

The process of Prim's algorithm:

Grow the tree one edge and one new vertex at a time.

Initially E = {} (the empty set) and V = {any one node}.

Loop (n - 1) times; each time select an edge (v1, v2) such that v1 belongs to V, v2 does not belong to V, and among all such edges (v1, v2) has the minimum weight.

E = E + (v1, v2)
V = V + v2

At the end, the edges in E form a minimum spanning tree, and V contains all the nodes.


The following example illustrates the execution process of Prim's algorithm.


Prim's algorithm starts with V = {A}, E = {}.


Select edge AF: V = {A, F}, E = {(A,F)}

Select edge FB: V = {A, F, B}, E = {(A,F), (F,B)}

Select edge BD: V = {A, B, F, D}, E = {(A,F), (F,B), (B,D)}

Select edge DE: V = {A, B, F, D, E}, E = {(A,F), (F,B), (B,D), (D,E)}

Select edge BC: V = {A, B, F, D, E, C}, E = {(A,F), (F,B), (B,D), (D,E), (B,C)}; the algorithm ends.


Proof of Prim's algorithm:

Suppose Prim's algorithm produces a tree P, and let T be a minimum spanning tree.

Assume P and T are different. Suppose the edges selected by Prim's algorithm in the first (k - 1) steps all lie in T, and let the tree built by the algorithm at that point be P'.

At step k, Prim's algorithm chooses an edge e = (u, v) that is not in T.

Say u is in P' and v is not.

Because T is a tree, there must be a path from u to v in T.

On this path the first vertex u is in P' and the last vertex v is not, so there must be an edge f = (x, y) on the path with x in P' and y not in P'.
Now compare the edge weights w(f) and w(e):

If w(f) > w(e), replacing f with e in T (adding e and removing f) gives a spanning tree with smaller total weight, contradicting the minimality of T.
If w(f) < w(e), then at step k Prim's algorithm would have chosen edge f (or one no heavier) rather than e, another contradiction.

So w(f) = w(e). Replace f with e in T; then the edge chosen by Prim's algorithm at step k is again in T, the total weight of T is unchanged, and after finitely many such steps T is transformed into P. Hence Prim's algorithm is correct.

Please take care to understand Prim's algorithm: it maintains a tree at every moment.

Our proof is constructive, and it also shows that all minimum spanning trees have the same multiset of edge weights!


Input
Line 1: 2 numbers N and M separated by a space, where N is the number of vertices and M is the number of edges. (2 <= N <= , 1 <= M <= 50000)
Lines 2 to M + 1: 3 numbers per line, S, E and W, giving the 2 endpoints and the weight of each of the M edges. (1 <= S, E <= N, 1 <= W <= 10000)

Output
Output the sum of the weights of all edges of the minimum spanning tree.

Input Example
9
1 2 4
2 3 8
3 4 7
4 5 9
5 6
6 7 2
7 8 1
8 9 7
2 8 all
3 9 2
7 9 6
3 6 4
4 6
1 8 8

Output Example
PNS

#include <iostream>
#include <cstdio>
#include <cstring>
#include <cstdlib>
#include <algorithm>
#include <queue>
#include <vector>
#include <stack>
using namespace std;

typedef long long ll;
typedef unsigned long long ull;
typedef unsigned int uint;

const int INF = 0x3f3f3f3f;
const int MAXN = 1e3 + 10;

int g[MAXN][MAXN];      // adjacency matrix
int lowcost[MAXN];      // cheapest edge from the tree to each outside vertex
bool vis[MAXN];         // whether a vertex is already in the tree
int n;

int prim(void);

int main()
{
#ifdef __air_h
    freopen("in.txt", "r", stdin);
#endif // __air_h
    int m;
    scanf("%d%d", &n, &m);
    for (int i = 1; i <= n; ++i) {
        for (int j = 1; j <= n; ++j) {
            g[i][j] = (i == j) ? 0 : INF;
        }
    }
    int s, e, w;
    for (int i = 0; i < m; ++i) {
        scanf("%d%d%d", &s, &e, &w);
        g[s][e] = g[e][s] = w;
    }
    printf("%d\n", prim());
    return 0;
}

int prim(void)
{
    int sum = 0;
    memset(vis, false, sizeof(vis));
    // start the tree from vertex 1
    for (int i = 2; i <= n; ++i) {
        lowcost[i] = g[1][i];
    }
    for (int i = 0; i < n - 1; ++i) {
        // pick the cheapest edge from the tree to an outside vertex
        int min_cost = INF;
        int t = -1;
        for (int j = 2; j <= n; ++j) {
            if (!vis[j] && lowcost[j] < min_cost) {
                min_cost = lowcost[j];
                t = j;
            }
        }
        if (min_cost == INF) {
            break;              // the graph is disconnected
        }
        sum += min_cost;
        vis[t] = true;
        // update the cheapest crossing edges now that t is in the tree
        for (int j = 2; j <= n; ++j) {
            if (!vis[j] && lowcost[j] > g[t][j]) {
                lowcost[j] = g[t][j];
            }
        }
    }
    return sum;
}


51Nod Greedy Classic Algorithm _ Kruskal Algorithm

An efficient implementation of Kruskal's algorithm requires a structure called a disjoint-set (union-find).

We will not introduce the disjoint-set here; we only present the basic idea and the proof of Kruskal's algorithm and leave the implementation for a later discussion (a minimal sketch is given below).

The process of Kruskal's algorithm:

(1) Sort all the edges by weight from small to large.
(2) Consider each edge in that order (from the smallest weight to the largest); as long as the edge does not form a cycle with the edges already chosen, keep it, otherwise discard it.

The algorithm finishes when it has successfully selected (N - 1) edges, which form a minimum spanning tree; if it cannot select (N - 1) edges, the original graph is not connected.
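
The original post defers the implementation, but for concreteness here is a minimal sketch of Kruskal's algorithm for the same input format as the Prim problem above, using a plain disjoint-set; treat it as an illustration rather than the author's code:

#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

struct Edge {
    int s, e, w;
};

static std::vector<int> parent;             // disjoint-set forest

static int find_root(int x)
{
    while (parent[x] != x) {
        parent[x] = parent[parent[x]];      // path halving
        x = parent[x];
    }
    return x;
}

int main()
{
    int n, m;
    scanf("%d%d", &n, &m);
    std::vector<Edge> edges(m);
    for (Edge& ed : edges) {
        scanf("%d%d%d", &ed.s, &ed.e, &ed.w);
    }

    // (1) sort all edges by weight from small to large
    std::sort(edges.begin(), edges.end(),
              [](const Edge& a, const Edge& b) { return a.w < b.w; });

    parent.resize(n + 1);
    std::iota(parent.begin(), parent.end(), 0);   // every vertex is its own root

    // (2) keep an edge iff it connects two different components (no cycle)
    int total = 0, taken = 0;
    for (const Edge& ed : edges) {
        int rs = find_root(ed.s), re = find_root(ed.e);
        if (rs != re) {
            parent[rs] = re;
            total += ed.w;
            if (++taken == n - 1) {
                break;                        // the spanning tree is complete
            }
        }
    }
    printf("%d\n", total);
    return 0;
}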


Consider the example:


After sorting, the edges are:
1  AF  1
2  DE  4
3  BD  5
4  BC  6
5  CD  10
6  BF  11
7  DF  14
8  AE  16
9  AB  17
10 EF  33
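
Walking Kruskal's algorithm over this sorted list (the step-by-step figures from the original are not reproduced, but the choices follow directly from the list): take AF (1), DE (4), BD (5) and BC (6); skip CD (10), since C and D are already connected through B; take BF (11), which joins {A, F} with {B, C, D, E}. Five edges have now been selected for six vertices, so the algorithm stops with total weight 1 + 4 + 5 + 6 + 11 = 27, the same spanning tree that Prim's algorithm built in the earlier example.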











