After the exam I played my first CF round. It went terribly and I only solved one problem. I stared at problem A for a long time before I understood what the author was hinting at: the answer should just be the better of the two endpoints. Anything else would have made the problem unreasonably hard, so I guessed, submitted, and it passed. After the round I listened to Yingjie's explanation and read the editorial, and it turned out I had actually hit on the correct solution; but because I didn't know how to analyze the expected complexity, I didn't dare pursue it. How was I supposed to know the complexity was meant to be analyzed in expectation?
Now for problem C, the one with the pitfall. During the contest I thought of a segment tree supporting range assignment, but since each assignment must also record, for every point in the segment, the difference |x - y| between the new value x and that point's old value y, I assumed a segment tree would inevitably TLE. Even after the contest I kept thinking it had to be O(n^2)... Only after studying the solution did I realize that although a single operation can cost O(n), the cost is amortized: over all operations the total work stays small.
The reason a single operation can reach O(n) is the clear function below: if the interval being cleared is not a single uniform block (that is, cov == -1), clear has to recurse into both children, and it keeps recursing until it reaches sub-intervals that are uniform. So the cost of clearing an interval is proportional to the number of maximal uniform blocks inside it. Initially every position is its own block (the values are 1, 2, 3, ..., n), so if the very first update covers 1..n, that clear costs O(n), because 1..n contains n blocks; after the operation, though, those n blocks have been merged into a single one. It is not hard to see that each update creates at most two new blocks (by splitting at its two endpoints), so over m operations the total number of blocks ever created is at most n + 2m (the initial n, as in the example above, plus at most 2 per update). The work of all the clear calls summed together is therefore bounded by n + 2m block visits, so the total never reaches O(n^2). It took me a long time to understand why the overall complexity can be kept at O(n log n).
#pragma warning(disable:4996)
#include <cstdio>
#include <algorithm>
#include <iostream>
#include <cstring>
#include <string>
#include <vector>
#include <cmath>
using namespace std;
#define ll long long
#define maxn 100500

struct Node {
    int l, r;
    ll sum;    // accumulated |differences| over this segment
    ll delta;  // lazy add tag for accumulated differences
    ll cov;    // uniform value covering this segment, or -1 if mixed
} N[4 * maxn];

int n, m;

void build(int i, int L, int R) {
    N[i].l = L; N[i].r = R;
    N[i].sum = N[i].delta = 0;
    N[i].cov = -1;
    if (L >= R) { N[i].cov = L; return; }  // leaf: initial value is its index
    int M = (L + R) >> 1;
    build(i << 1, L, M);
    build(i << 1 | 1, M + 1, R);
}

// Push |val - cov| into every uniform block below node i.
// Cost is proportional to the number of uniform blocks visited.
void clear(int i, int L, int R, int val) {
    if (N[i].cov != -1) {
        N[i].delta += abs(val - N[i].cov);
        N[i].sum += abs(val - N[i].cov) * (N[i].r - N[i].l + 1);
    } else {
        clear(i << 1, L, R, val);
        clear(i << 1 | 1, L, R, val);
        N[i].sum = N[i << 1].sum + N[i << 1 | 1].sum
                 + N[i].delta * (N[i].r - N[i].l + 1);
    }
    N[i].cov = -1;
}

void update(int i, int L, int R, int val) {
    if (N[i].l == L && N[i].r == R) {
        clear(i, L, R, val);
        N[i].cov = val;  // the whole segment is now one uniform block
        return;
    }
    if (N[i].cov != -1) {  // push the cover tag down before splitting
        N[i << 1].cov = N[i << 1 | 1].cov = N[i].cov;
        N[i].cov = -1;
    }
    int M = (N[i].l + N[i].r) >> 1;
    if (R <= M) update(i << 1, L, R, val);
    else if (L > M) update(i << 1 | 1, L, R, val);
    else update(i << 1, L, M, val), update(i << 1 | 1, M + 1, R, val);
    N[i].cov = -1;
    N[i].sum = N[i << 1].sum + N[i << 1 | 1].sum
             + N[i].delta * (N[i].r - N[i].l + 1);
}

ll query(int i, int L, int R) {
    if (N[i].l == L && N[i].r == R) return N[i].sum;
    int M = (N[i].l + N[i].r) >> 1;
    if (R <= M) return query(i << 1, L, R) + N[i].delta * (R - L + 1);
    else if (L > M) return query(i << 1 | 1, L, R) + N[i].delta * (R - L + 1);
    else return query(i << 1, L, M) + query(i << 1 | 1, M + 1, R)
              + N[i].delta * (R - L + 1);
}

int main() {
    while (cin >> n >> m) {
        build(1, 1, n);
        int t, l, r, x;
        for (int i = 0; i < m; i++) {
            scanf("%d%d%d", &t, &l, &r);
            if (t == 1) {
                scanf("%d", &x);
                update(1, l, r, x);
            } else {
                printf("%I64d\n", query(1, l, r));  // %I64d: old Codeforces / MSVC
            }
        }
    }
    return 0;
}