Reposted from: Worst case and average case of an algorithm
If you run the same program several times, it will sometimes be faster and sometimes a bit slower. Even though the algorithm is the same, its efficiency on input 1 and input 2 is not necessarily the same. In other words, an algorithm's running time and efficiency vary with the input data, sometimes faster and sometimes slower. For example, sorting a sequence that is already in order is relatively easy. The input size matters too: a short sequence is easier to sort than a very long one.
In general, we want a lower bound on the time efficiency of an algorithm, because everyone likes a guarantee: the algorithm will be no less efficient than we promise. This is the so-called worst-case analysis. Worst-case analysis establishes, for a given input size, a bound that the algorithm's efficiency will never fall below, or equivalently, an upper bound that its running time will never exceed.
- For example: one morning, just after heading out the door, I suddenly remembered that I had forgotten my phone. These days the key, the wallet, and the phone are the big three; you cannot leave home without any of them. So I went back to look for it. I opened the door, and there was the phone on the entryway shelf; I had simply forgotten to grab it while putting on my shoes. That is the good outcome: finding it took essentially no time. But if it had not been there, I would have had to search everywhere: from the living room to the bedroom, from the bedroom to the kitchen, from the kitchen to the bathroom, the seconds ticking away, until it suddenly occurred to me that I could dial my phone from the landline and follow the ringtone. How silly not to think of that sooner. I finally found it under the pillow, went off to work, and arrived late. There goes this year's attendance bonus, all because of a phone.
Sometimes you are lucky when looking for something, and sometimes you simply cannot find it. In reality, though, most of what we encounter is neither the best case nor the worst case, so the average case is what occurs most often.
The complexity of an algorithm is the amount of resources (time or space) required to run it. The same algorithm may consume different amounts of resources on different input data, so when analyzing the complexity of an algorithm there are three main cases to consider: the worst case, the average case, and the best case.
Analyzing an algorithm works the same way. Suppose we look for a particular number in an array of n random numbers. In the best case the first element is the one we want, and the time complexity of the algorithm is O(1). But the number might also be sitting in the last position, in which case the time complexity is O(n); this is the worst case.
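To make the example concrete, here is a minimal sketch of the linear search just described, written in Python; the function name and test values are my own, purely for illustration.

```python
def linear_search(arr, target):
    """Scan arr left to right; return the index of target, or -1 if absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i  # found: in the best case this happens at index 0
    return -1         # not found: the whole array was scanned

print(linear_search([7, 3, 9, 1], 7))   # 0  -> best case, one comparison
print(linear_search([7, 3, 9, 1], 1))   # 3  -> worst case, n comparisons
print(linear_search([7, 3, 9, 1], 5))   # -1 -> worst case, target absent
```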
The worst-case running time is a guarantee: the actual running time will never exceed it. In applications this is one of the most important requirements; usually, unless otherwise specified, the running time we refer to is the worst-case running time.
The average running time takes the probabilistic point of view: if the number is equally likely to be in any position, then on average the target element is found after about n/2 comparisons. The average case better reflects the performance of an algorithm in most situations. Average-case analysis means taking all inputs of size n, running the algorithm on each of them, and then averaging the results. Of course, it is impossible to run every possible input in practice, so the average case usually refers to a mathematical expectation, and computing that expectation requires assumptions about the distribution of the inputs.
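For the linear search above, a short calculation shows where the n/2 figure comes from, under the stated assumption that the target is present and equally likely to occupy each of the n positions:

```latex
% Expected number of comparisons for a successful linear search,
% assuming the target is at position i with probability 1/n:
E[C] = \sum_{i=1}^{n} i \cdot \frac{1}{n}
     = \frac{1}{n} \cdot \frac{n(n+1)}{2}
     = \frac{n+1}{2} \approx \frac{n}{2}
```

So the exact expectation is (n+1)/2 comparisons, which the text rounds to n/2.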
Of all the cases, the average running time is the most meaningful, because it is the running time we expect to see. In other words, when we run a piece of code, we hope to observe the average running time. In reality, however, the average running time is hard to obtain through analysis, and it is usually estimated by running the program on a certain amount of experimental data.
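As a sketch of such an empirical estimate (the array size and trial count here are arbitrary choices of mine, not prescribed by the text), one can time the search on many random inputs and average the results:

```python
import random
import timeit

def linear_search(arr, target):
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

n, trials = 10_000, 200
total = 0.0
for _ in range(trials):
    arr = random.sample(range(n * 10), n)  # n distinct random numbers
    target = random.choice(arr)            # guarantees a successful search
    total += timeit.timeit(lambda: linear_search(arr, target), number=1)

print(f"estimated average search time: {total / trials:.6f} s")
```

Averaged over enough trials, the measured time should settle near the n/2-comparison expectation derived above.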
Sometimes we also need to know the best case, and it is meaningful on two levels: first, we want to know just how well things can go when luck is on our side; second, if we can prove that luck really is on our side, we need to know how the algorithm behaves in that case. Best-case analysis asks, for a given input size, which input lets the algorithm run most efficiently. Of course, some people consider best-case analysis a bit bogus: we can doctor the input so that a very slow algorithm appears to run fast, thereby achieving an effect of deception.
For the analysis of an algorithm, one method is to average over all cases; the time complexity computed this way is called the average time complexity. The other approach is to compute the time complexity in the worst case, called the worst-case time complexity. Generally, in the absence of special instructions, time complexity refers to the worst case.
- I once asked a teacher why we analyze the worst-case time complexity of an algorithm. The teacher's answer: for a program you look at the worst time, and the worst time is easier to calculate.
- Well, that is one reason. The likely reasons are roughly the following:
- The worst-case complexity is the maximum amount of resources consumed over all possible input data. If the worst-case complexity meets our requirements, we can guarantee that there will be no problem in any case.
- Some algorithms encounter the worst case frequently. For example, a lookup algorithm often needs to search for a value that does not exist, which forces it to examine every element.
- You may find the average complexity more appealing, but the average case has several problems. First, it is hard to calculate; for most algorithms the worst-case complexity is much easier to compute than the average case. Second, for many algorithms the average-case and worst-case complexities are the same. Third, what is the true average? Assuming that all possible inputs occur with equal probability is also unreasonable; in reality they usually do not, and the actual distribution of the input data may well be beyond your knowledge.
- Considering the best-case complexity is meaningless. Almost any algorithm can be slightly modified to achieve the best possible complexity (by inspecting the structure of the input, it can be made O(1)). How? Compute the answer for one particular input in advance, check for that input at the start of the algorithm, and if it matches, return the stored answer; see the sketch after this list.
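To illustrate the trick in the last point, here is a minimal sketch; the precomputed input, its answer, and the wrapper function are hypothetical, invented only for this illustration:

```python
# Hypothetical: give a sorting routine a trivially fast "best case"
# by precomputing the answer for one specific input.
PRECOMPUTED_INPUT = [9, 4, 7, 1]
PRECOMPUTED_ANSWER = [1, 4, 7, 9]

def rigged_sort(arr):
    if arr == PRECOMPUTED_INPUT:   # one scan against the known input, no sorting
        return PRECOMPUTED_ANSWER  # best case: the stored answer is returned
    return sorted(arr)             # all other inputs: the real O(n log n) work

print(rigged_sort([9, 4, 7, 1]))  # [1, 4, 7, 9] via the precomputed best case
print(rigged_sort([3, 2, 1]))     # [1, 2, 3] via the normal sort
```

This is exactly the kind of doctored best case the text warns about: it says nothing about how the algorithm performs on inputs in general.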
Extended Reading
The list of topics in this series is as follows:
- Session 1: How to learn data structures?
- Session 2: History and origins of data structures
- Session 3: Some concepts related to data structures
- Session 4: The logical structure of data
- Session 5: The physical structure of data
- Session 6: About data types
- Session 7: Abstract data types (ADT)
- Session 8: Supplement: the relationships among the basic concepts of data structures
- Session 9: The relationship between data structures and algorithms
- Session 10: What is an algorithm?
- Session 11: Five basic properties of an algorithm
- Session 12: What makes a good algorithm?
- Session 13: Performance analysis of algorithms
- Session 14: How to calculate the time complexity of an algorithm
- Session 15: Worst case and average case of an algorithm
- Session 16: Space complexity of an algorithm