Discussion on congestion and congestion mitigation of routers and highways based on queuing theory

About this article: I believe many people have been caught in a major traffic jam on the highway during a holiday, yet the jam is eventually cleared. People also ask how long a router's queue can grow and whether it will end up denying service altogether. Ten years ago I naïvely thought the designers of highways and of routers and switches had easy jobs. Now that I know more, I find they do not: their work involves many trade-offs and games, not only in technology but also in psychology, sociology, and economics.
The purpose of this article, then, is to use the simplest possible language to walk through the queuing theory behind expressways and packet-switched networks. There is no complicated mathematical derivation here; you can do that yourself, or review the relevant chapters of a university probability textbook. If you are not interested in derivations, just remember the conclusions; if you have doubts, as I once did, work them through in detail.
Queuing theory overview
The theory behind queues is, naturally, queuing theory. Queuing theory is the core of packet switching. Not only that, it is at the heart of every queuing scenario we meet, from highway construction to checkout lines and bank counters. In short, it is very important, and in fact it is very simple. A very good article, "A Dash of queueing theory", describes in plain terms what most people assume is a complicated subject; I plan to write a review of it over a weekend. Along the way, I will give my personal understanding of packet switching and statistical multiplexing.
The theory that makes packet switching feasible is queuing theory. In fact, long before packet switching existed, queuing theory had already been applied for hundreds or even thousands of years; packet-switched networks simply found their theoretical footing in this fact and happened to coincide with the springtime of network development, so the two were married.
The other core of packet switching is statistical multiplexing. Again, long before packet switching existed, statistical multiplexing had been around for thousands of years or even longer. The world we live in is statistically multiplexed: roads, land, and public facilities are all shared this way.
The theory behind why statistical multiplexing is accepted is, again, queuing theory, namely queuing fairness. Of course, earlier in history the absence of statistical multiplexing led to unfairness, triggering butterfly effects that produced kings, empires, and rulers; but that is not the topic of this article.
Plotting the task (packet, vehicle) queuing process and the service process against time, separately
If we plot the queuing process alone over time, we get the following diagram (the input rate is fixed):
Likewise, if we plot over time the process by which the service consumes entities from the queue, we get the following diagram:
Merging the queuing process and the service process for tasks (packets, vehicles)
If entities only ever joined the queue, the queue would grow without bound. In reality no queue is infinite; otherwise why would anyone line up for a service that never comes? The abandoned railway station in the novel "Ten Years" does not exist in reality. Why? Because wherever there is a queue, the head of the queue is being serviced: entities join at the tail and are served at the head, a simple first-in, first-out process.
So if we merge the two diagrams above, we find that the band between the two curves is the queue itself, as shown below:
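Here is a minimal sketch in Python, with assumed rates, of what the merged diagram expresses: a cumulative input curve A(t), a cumulative service curve D(t), and the queue as the vertical gap between them.

```python
# A sketch with assumed parameter values of the two cumulative curves:
# A(t) = tasks that have joined the queue by time t,
# D(t) = tasks the service has finished by time t.
# The vertical gap A(t) - D(t) is the "band" between the curves, i.e. the queue length.

arrival_rate = 3.0   # tasks joining per unit time (assumed)
service_rate = 2.0   # tasks served per unit time (assumed, slower, so the queue grows)

def queue_length(t: float) -> float:
    arrived = arrival_rate * t                # cumulative input curve A(t)
    served = min(arrived, service_rate * t)   # the service cannot finish what has not arrived
    return arrived - served

for t in range(6):
    print(f"t={t}: queue length = {queue_length(t):.1f}")
```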
Queue analysis with a fixed task input rate
1. The task input rate equals the service's absorption rate
Starting from the initial state, this case produces no queue. For example, if three people arrive per unit time needing service and the service desk has three attendants, then, ignoring service time, there will always be exactly three people being served per unit time. However, once some other factor intervenes, such as one customer's business taking too long (service time no longer negligible), a queue appears: once one service is delayed, the overall service output rate slows, and that is the main cause of queuing.
2. The task input rate and the service's absorption rate are not equal
If the service output rate is greater than the queue input rate, the queue will eventually disappear; if it is less than the queue input rate, the queue will keep growing. This is common sense, but it is the basis for the deeper analysis that follows.
3. Conclusion
From this analysis we can see that only a small increase in service resources is needed, not a large one. The effect we want is to tilt the output curve by just enough of an angle that the service output curve eventually intersects the queue input curve:
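A small worked example of this conclusion, with assumed numbers: once the service rate is even slightly above the input rate, an existing backlog drains after backlog / (service rate − input rate).

```python
# Assumed numbers illustrating the "tilt the output curve a little" conclusion:
# with an existing backlog and a service rate only slightly above the input rate,
# the two curves intersect after backlog / (mu - lam).

lam = 10.0        # input rate, tasks per minute (assumed)
backlog = 50.0    # tasks already waiting when the extra capacity is added (assumed)

for mu in (10.5, 11.0, 12.0):             # slightly faster service rates
    drain_time = backlog / (mu - lam)     # time until the output curve catches the input curve
    print(f"service rate {mu}: queue gone after {drain_time:.0f} minutes")
```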
Reality: Poisson-distributed input and exponentially distributed input intervals
The analysis and illustrations above make the queuing system look unstable: the queue either shrinks to zero or grows forever. Hold that view and you will naturally question whether a router can be usable at all. But in reality, whether on a congested highway, in a rush on scarce goods, in a ticket line, or in the Internet world of routers, switches, and servers, we never see a queue rapidly drop to zero or grow without bound. The queue always fluctuates, and queued entities, unless they give up, always get served eventually; the queue never extends indefinitely. Why is this?
Because the input rate is not a constant; it follows a Poisson distribution over a range. Put plainly, the input rate has an average, and the further a value deviates from that average, the less likely it is to be observed. For example, if the average input rate is 10, then 10 entities arriving per unit time is the most likely outcome, 9 or 11 is the next most likely, then 8 or 12, then 7 or 13, and so on; 1 or 19 arrivals are far less likely than the values near the average.
Because the input rate is Poisson distributed, the interval between two adjacent arrivals must also follow some law, and it can be derived from the Poisson distribution. Mathematics is just a tool here, so I will not post the derivation, only the answer: the interval between adjacent arrivals follows an exponential distribution. This means the next arrival is most likely to come within the shortest interval, say within 1 minute; if it has not come after 5 minutes, your hope that it will come within 15 minutes is already slim. I call this the "the next one is just about to arrive" law. It matches everyday intuition: if someone you are waiting for is 20 minutes late, they are probably not coming at all; or after a job interview, if the company calls before you even get home, you are very likely being hired.
This is the real queueing scenario under Poisson distribution and exponential distribution.
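A quick empirical check of the two distributions described above, written as a Python sketch with an assumed average rate of 10: arrivals are generated from exponential gaps, and the per-unit-time counts cluster around the average exactly as the Poisson description predicts.

```python
import random
from collections import Counter

# Exponential inter-arrival gaps and Poisson per-unit-time counts are two views
# of the same arrival process; the rate and horizon below are assumed.

random.seed(1)
rate = 10.0                    # average arrivals per unit time (assumed)
t, horizon = 0.0, 10_000
counts = Counter()

while True:
    t += random.expovariate(rate)   # exponential gap to the next arrival
    if t >= horizon:
        break
    counts[int(t)] += 1             # count arrivals in each unit-time slot

per_slot = [counts.get(i, 0) for i in range(horizon)]
print("average arrivals per slot:", round(sum(per_slot) / horizon, 2))   # close to 10
print("slots with 9, 10 or 11 arrivals:",
      per_slot.count(9) + per_slot.count(10) + per_slot.count(11))       # common
print("slots with 1 or 19 arrivals:",
      per_slot.count(1) + per_slot.count(19))                            # rare
```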
Queue analysis with a Poisson-distributed task input rate
The real queuing scenario looks like this:
Look at the picture from a distance and you can ignore the slightly wiggly details: the whole input curve is essentially a straight line, and the service output curve is the same, much as fractal theory suggests. Two points are worth keeping in mind:
a) In most cases the input rate is the average input rate, or at least converges to it (the Poisson distribution at work);
b) In most cases the next arrival comes almost immediately (the exponential distribution at work), subject to point a).
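Here is a toy discrete-time simulation, with assumed rates, of the scenario in the figure: Poisson-style random arrivals against a fixed service rate slightly above the average, showing a queue that fluctuates but never runs away.

```python
import random

# Arrivals per step are random counts generated from exponential gaps; the
# service drains a fixed number per step. With the fixed service rate a little
# above the average input rate, the queue wanders but stays bounded.

random.seed(2)
avg_input = 10.0      # average arrivals per step (assumed)
service = 11          # fixed departures per step, slightly above the average (assumed)

def poisson_count(lam: float) -> int:
    """Count arrivals in one step by summing exponential gaps."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t >= 1.0:
            return n
        n += 1

queue, peak = 0, 0
for step in range(10_000):
    queue += poisson_count(avg_input)
    queue = max(0, queue - service)
    peak = max(peak, queue)

print("final queue:", queue, "| worst backlog seen:", peak)
```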
Queuing congestion and its mitigation on the expressway
Examples of queuing congestion
1. The single-lane case
If the car in front stops, the cars behind cannot pass; after all, they cannot fly. A queue forms. The lead car pulls away, then the next, and so on, vehicles leaving the queue one after another at some fixed rate, but the queue does not disappear quickly, because new vehicles keep joining at the tail. If the arrival rate equals the departure rate, the queue never disappears; if the departure rate is smaller, the queue not only never disappears but keeps growing; and if the departure rate is greater than the arrival rate, the queue eventually disappears.
2. Two lanes, ignoring on-ramps and lane changes
This case hardly needs consideration, because in a statistically multiplexed channel it is practically impossible. It would mean a lane suffering "head-of-line congestion" overlaid, without any interference, on a normal free-flowing lane: two lanes that do not affect each other at all, one completely stalled, the other at full speed. Have you ever seen that? I certainly have not.
Why is this?
One of the prerequisites for a statistically multiplexed network or channel is queuing fairness! This is the core of the core; it is precisely because of this principle that a packet-switched network has theoretical justification and practical usability. Consider a circuit-switched network, or a train: the channel is dedicated, and while it exists it cannot be borrowed by others even when idle. Every communicating entity must apply for its own private channel. Later, in the name of conserving resources, various kinds of multiplexing were introduced, but their granularity remained very coarse and idle gaps remained. Take the time-division multiplexing of trains on the Jingguang line: although many trains share the track at different times, you can still see long stretches of time with no train on the rails. Strict time-slot multiplexing demands a substantial clock-synchronization mechanism, so the gap between adjacent slots must be long, which is costly. And for a shipper whose schedule is not fixed, you cannot guarantee that the slot T assigned to truck A today will actually be used by truck A, since it may have no shipment today; reassigning slot T to someone else requires a complex scheduling mechanism... What is the ultimate, most reasonable scheme? Eliminate the waste entirely: keep shrinking the multiplexing granularity, eventually remove the central control, and turn the channel into a completely free market in which the communicating entities themselves decide the sharing according to their own needs. That is statistical multiplexing.
No rule is the best rule!
The most important rule is how to queue, how to cut in, how to squeeze into a gap. Whatever exists is reasonable, because all multiplexing rests on one premise: fairness. A fully private channel, TDM or FDM, causes no queuing, because for multiple communicating entities the channels are physically or logically separate; but under statistical multiplexing the channel is completely shared, so fairness has to be built by the entities themselves. When traffic flows freely, everyone is treated fairly; the unfair moments always come with queuing, and congestion is when unfairness is felt. So on top of queuing we need a fairness mechanism, and that mechanism is cutting in and changing lanes, in other words building a virtual output queue (VOQ). Vehicles stuck in the congested lane have this right, because they are completely equal to the vehicles driving normally in the other lane: the congestion was caused at the head of the queue, the queued vehicles are innocent, and the situation is unfair to them. The only fair measure available is to pull down the quality of flow across all lanes.
3. Why congestion spreads so quickly
Once you understand the analysis above, you can see that the next arrival always comes almost immediately, whether at the expectation of the Poisson distribution or out at its edges, thanks to the exponential distribution of the intervals. The next arrival is always most likely to come within the shortest interval, and that is the root cause of rapid spread. Comparing with the illustration above, you can see that the band between the two curves grows in area very quickly, so local congestion turns into global congestion.
4. The real situation: queuing everywhere, large-area congestion
The above explains why, once queuing congestion starts on a highway, it is almost impossible for traffic not to come to a stop. So what does the real situation look like? Everyone knows it without my saying: sometimes you are stuck on the highway for an hour, crawling forward, only to find at the front that two cars in the lane farthest from yours had a minor scrape... On a one-way four-lane road, how can an accident in the outermost lane affect the innermost lane, and how can it produce a large-area jam with a queue perhaps dozens of kilometres long?
In the earlier idealized case, cutting in and changing lanes were not allowed, so the accident would affect only one lane and congest only that lane. In reality, the cars in that lane will certainly find the situation unfair; this is what was mentioned earlier, what routers call head-of-line (HOL) congestion. Because vehicles, unlike packets, are self-routing, the cars in the congested lane start building their own VOQ by force: put plainly, they cut in and change lanes to avoid queuing. That triggers further VOQ building by others, a chain reaction that spreads the congested lane's traffic evenly into the normal lanes, so the input rate of the normal lanes exceeds its usual level and the same congestion appears in them as well...
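The following is a minimal sketch of the VOQ idea described above (port counts and packet names are made up): each input keeps one small queue per output, so a packet stuck behind a congested output no longer blocks packets headed elsewhere.

```python
from collections import deque

# Each input port keeps one queue per output port. A packet waiting for a
# congested output (the head-of-line blocker) does not stall packets that
# want outputs which are free.

N_INPUTS, N_OUTPUTS = 4, 4

# voq[i][o] holds packets that arrived on input i and want to leave on output o
voq = [[deque() for _ in range(N_OUTPUTS)] for _ in range(N_INPUTS)]

def enqueue(in_port: int, out_port: int, packet) -> None:
    voq[in_port][out_port].append(packet)

def schedule_one_round(blocked_outputs: set) -> list:
    """Each free output pulls at most one packet per round, scanning inputs in order."""
    sent = []
    for o in range(N_OUTPUTS):
        if o in blocked_outputs:
            continue                      # this output is the "stopped lane"
        for i in range(N_INPUTS):
            if voq[i][o]:
                sent.append(voq[i][o].popleft())
                break
    return sent

# Output 0 is congested, yet packets for the other outputs still flow:
enqueue(0, 0, "pkt-A")   # would be a head-of-line blocker in a single FIFO
enqueue(0, 1, "pkt-B")
enqueue(1, 2, "pkt-C")
print(schedule_one_round(blocked_outputs={0}))   # ['pkt-B', 'pkt-C']
```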
This is exactly the situation in a TCP/IP network. Whether for vehicles or for packets, the network is statistically multiplexed and has no strict rules such as rigid TDM or FDM; the essential point is squeezing into gaps: as long as the channel is empty, why not use it? Remember CSMA/CD? That is basically it. When the gap between one car and the car ahead exceeds a safe distance, a car in the adjacent lane will try to slip in; it first performs its own "carrier sensing and collision detection", such as flashing its indicator, to make sure the other car knows its intention, and so on. This is not gentlemanly, but it is the essence of every statistically multiplexed channel. It is a process of trading risk for efficiency, and it can even be mutually beneficial; the unwritten rules keep conflicts statistically rare (without these tacit rules, imagine a congested city centre full of box trucks, cars, and pedestrians somehow never scraping each other; remember that rear-view mirrors have blind spots and that drivers do not necessarily have a precise feel for their own car's length, width, and turning radius...). The impact is always local and the incidents are few, so the risk is worth taking. But once a conflict does happen, a scrape or an accident, it causes large-area congestion, as discussed above. The next question, then, is how this queuing congestion is relieved. It certainly will be relieved; if it were not, this statistically multiplexed network would be completely unusable! Read on.
How congestion on the freeway is relieved
We know the road will get congested, but the congestion is always eventually relieved. In the ideal setting, with vehicles arriving at a fixed input rate and leaving at a fixed output rate, once a queue forms it persists forever; that is the conclusion of the theoretical analysis above. The real situation is different: at the moment congestion starts to ease, the output rate is indeed fixed, but the input rate is not, and it is the Poisson distribution of the input rate that lets the congestion finally dissipate.
Admittedly, adding a lane amounts to adding a little service capacity. As discussed earlier, with a fixed input rate and a fixed service rate, i.e. fixed service resources, if the input curve and the output service curve are parallel, the queue behind the congestion persists forever at a constant length; but add even one lane and the output service curve becomes steeper and eventually intersects the input curve, so the queue disappears. A tiny difference in slope makes all the difference.
Of course the real situation is more complex; we would also have to account for the extra congestion caused by uncivilized cutting in and lane changing, which is the bad side. The good side of the real situation is that the input curve is not a straight line: its slope varies. How does it vary? Simply put, the overall trend of the curve follows the direction of the fixed input rate, but the behaviour at each point follows a Poisson distribution whose expectation is exactly the slope of that fixed input rate. Let us look at how this Poisson distribution pulls us out of the mire once the output service recovers, in two different cases.
Scenario 1: Output service paused, for example two cars scraping or a traffic accident closing the lane
While the accident is not yet cleared, the Poisson distribution does nothing for the congestion; every arriving vehicle only makes the queue longer. Because the arrival rate follows a Poisson distribution, sometimes a few more cars arrive and sometimes a few fewer, but either way the queue grows; only the growth rate varies.
Once the accident is cleared, traffic through the accident point recovers completely: the first car in the queue pulls away, then the second, then the third... At this point the departure rate is fixed, and it is the full rate, as if the passing rate at this point had jumped to the largest, least probable value of the Poisson distribution, while the arrival rate at the tail of the queue still follows the ordinary Poisson distribution. Until the queue disappears, the passing rate at the accident point, i.e. the queue head, is fixed and greater than the arrival rate at the tail, so the queue shrinks and disappears.
If the arrival rate at the tail of the queue were exactly equal to the passing rate at the head, the queue would persist forever!
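A sketch of Scenario 1 with assumed numbers: the lane stays closed for a while as Poisson arrivals pile up at the tail, then service resumes at a fixed full rate above the average arrival rate and the backlog drains.

```python
import random

# All rates and durations below are assumed for illustration.

random.seed(3)
avg_arrivals = 10.0     # average cars joining the tail per minute (assumed)
full_rate = 14          # cars passing the cleared accident point per minute (assumed)

def arrivals_in_one_minute(lam: float) -> int:
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t >= 1.0:
            return n
        n += 1

queue = 0
for minute in range(30):          # the lane stays closed for 30 minutes
    queue += arrivals_in_one_minute(avg_arrivals)
backlog = queue

minutes_to_clear = 0
while queue > 0:                  # service resumes at the fixed full rate
    queue += arrivals_in_one_minute(avg_arrivals)
    queue = max(0, queue - full_rate)
    minutes_to_clear += 1

print(f"backlog of {backlog} cars cleared in {minutes_to_clear} minutes")
```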
Scenario 2: Output service delayed, for example at a toll station
If the number of toll windows equalled the number of lanes, then by the earlier analysis the toll station, which is always there and never goes away, would cause queuing congestion that "never returns to normal", so the explanation for Scenario 1 cannot be applied to Scenario 2. Yet we observe that outside of holidays there is no large-area, unrecoverable queuing congestion at highway toll stations. Why?
Because the number of toll windows is greater than the number of lanes. Each window increases the delay of an individual car, but the windows handle multiple vehicles in parallel, so the total throughput (the line-rate capability) is unchanged and, overall, the queuing congestion still eases.
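A back-of-the-envelope check of this point, with assumed figures for lane throughput and per-car toll time: each window adds delay for an individual car, but enough windows in parallel preserve the lane's total throughput.

```python
# All numbers are assumed for illustration.

lane_throughput = 1800      # cars per hour a free-flowing lane delivers (assumed)
seconds_per_car = 20        # time one toll window spends on one car (assumed)

window_throughput = 3600 / seconds_per_car        # 180 cars per hour per window
windows_needed = lane_throughput / window_throughput

print("one window handles", window_throughput, "cars/hour")
print("windows needed per lane to avoid a growing queue:", windows_needed)   # 10
```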
Scenario 2 is worth noting: it is exactly what routers do!
To close this section, consider a basic fact. The Shanghai-Jiading (Hujia) Expressway, the S5, removed its toll station around 2012, and around 2013 removed the central green divider and used the hard shoulders on both sides to add one lane in one direction. Think about what that means. Why add a lane in only one direction rather than both? Why did something as simple as removing a toll station and adding a lane bring such a large increase in capacity? The analysis above should give you the answer; note that the point is not merely whether drivers save money on tolls.
Here is another question to think about: the design of the ancient Roman road system. That, however, is not a queuing-congestion problem but a connectivity problem: how does connectivity multiply benefits exponentially?
Another reason highway toll stations are free on holidays
It is not just about the money; the toll station itself is the unreasonable part. An analogy: under heavy traffic, a backbone router will drop some of its traffic-audit policies, trusting the authenticated BGP peer at the far end and leaving the auditing to be completed by the edge routers inside the IGP domain. Isn't that just like waiving tolls on holidays? It is about capacity, not about wanting to save drivers money; the money is not really saved anyway, because everyone takes the car and everyone heads for the tourist attractions...
Highways, like routers, have their average capacity, maximum capacity, worst-case queuing delay, toll-station delay, and degree of parallelism worked out at design time through complex mathematical calculation that inevitably involves the Poisson and exponential distributions; on top of that, psychological and climatic factors carry considerable weight. And the calculation is not aimed at bursts: no organization designs a highway, or a router buffer, primarily around burst traffic, because the enormous cost would bring no substantial benefit (and if someone did, it would truly be a public service). The queuing system is designed around the average capacity, with total capacity only slightly larger than the average to absorb small, unpredictable bursts. The rest of the time it relies entirely on the following risky assumption to relieve queuing congestion:
The input curve is a wiggly curve whose overall trend is a straight line with a slope equal to the expected input rate. So for most of the time, as long as the output service rate is slightly larger than that expected input rate, the output service curve can catch up with the input curve, and in most cases the two curves will intersect.
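A rough illustration of this risky assumption, with an assumed average of 10 arrivals per interval: the fraction of intervals whose Poisson burst exceeds the provisioned capacity drops off quickly once the capacity sits a little above the average.

```python
from math import exp, factorial

# With Poisson arrivals averaging lam per interval and a capacity of c per
# interval, compute how often an interval's burst exceeds the capacity.
# The average rate and the capacities tried are assumed.

lam = 10.0                                   # average arrivals per interval (assumed)

def prob_exceeds(c: int) -> float:
    """P(Poisson(lam) > c) = 1 - sum_{k=0..c} e^-lam * lam^k / k!"""
    return 1.0 - sum(exp(-lam) * lam**k / factorial(k) for k in range(c + 1))

for capacity in (10, 12, 15, 20):
    print(f"capacity {capacity}: overflow in {prob_exceeds(capacity):.1%} of intervals")
```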
The size of the router buffer and the choice of scheduling algorithm
After all this analysis, do you suddenly see why the size of a router's buffer, its queuing area, is not easy to set? You have to anticipate the average traffic, the minimum traffic and how long it lasts, the maximum burst, the burst duration and when bursts occur, and then make a decision based on the required quality of service, decomposed into input rates and output rates, looking for the saddle point where the costs balance. In short, a router's buffer may need to change dynamically, which involves particularly complex mathematics; it is a game, so besides queuing theory you also need some game theory.
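To make the trade-off concrete, here is a toy experiment with assumed rates and buffer sizes: the same Poisson-style traffic is offered to buffers of different sizes, and the tail-drop count shows what a too-small queuing area costs.

```python
import random

# Arrivals per step are random counts from exponential gaps; the service drains
# a fixed number per step; packets arriving to a full buffer are tail-dropped.
# All sizes and rates are assumed for illustration.

random.seed(4)
avg_input, service = 10.0, 11        # arrivals per step vs fixed departures (assumed)

def arrivals(lam: float) -> int:
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t >= 1.0:
            return n
        n += 1

for buffer_size in (5, 20, 80):
    queue, dropped = 0, 0
    for _ in range(50_000):
        for _ in range(arrivals(avg_input)):
            if queue < buffer_size:
                queue += 1               # packet admitted into the buffer
            else:
                dropped += 1             # tail drop: the buffer is full
        queue = max(0, queue - service)
    print(f"buffer {buffer_size}: dropped {dropped} packets")
```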
The buffer setting concerns only the input side; on the output side there is also the scheduling algorithm to choose. Honestly, routers are much harder than highways, because vehicles on a freeway are self-routing and build their own output queues (by signalling, changing lanes, and so on), while packets are blind and are steered entirely by the algorithms inside the router.
Still, I genuinely hope the highway analogy helps people better understand the nature of routers, along with packet switching and the essence of statistical multiplexing.
Copyright notice: this is the blogger's original article and may not be reproduced without the blogger's permission.