I. What is a data structure?

There are many different interpretations of the concept of a data structure, and plenty of controversy as well. What follows is only my personal understanding.

Data structure: how to store the large, complex problems of the real world in primary memory (main memory) using a specific data type and storage structure, and, on that basis, implement operations on the data (such as searching for an element or deleting an element). The corresponding operations are also called algorithms.

1. Algorithms are implemented on top of particular data structures.

2. An algorithm describes the specific steps for solving a specific problem. It has the following five properties: finiteness, determinism, feasibility, input, and output.
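As a concrete illustration of the five properties above, here is a minimal linear-search sketch in Python (the post names no language; the function name is my own):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent.

    Input: a list and a value to find.  Output: an index (or -1).
    Finiteness: the loop runs at most len(items) times.
    Determinism: every step is unambiguous.
    Feasibility: every step is a basic, executable operation.
    """
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

print(linear_search([4, 8, 15, 16], 15))  # found at index 2
print(linear_search([4, 8, 15, 16], 99))  # not found: -1
```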

II. Time complexity and space complexity:

(1) Time frequency. The time an algorithm takes to execute cannot be computed theoretically; you have to run it on a computer to find out. But it is neither possible nor necessary to test every algorithm on a machine — we only need to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of statement executions in the algorithm: the more statements executed, the more time it takes. The number of statement executions in an algorithm is called its statement frequency or time frequency, denoted T(n).
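To make "statement frequency" concrete, here is a hypothetical sketch in Python that instruments a bubble sort with a counter, so T(n) can be observed directly rather than timed:

```python
def bubble_sort_counted(items):
    """Bubble sort that also counts comparisons, as a proxy for T(n)."""
    a = list(items)
    count = 0
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            count += 1                      # one basic operation executed
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, count

sorted_a, t_n = bubble_sort_counted([5, 3, 1, 4, 2])
print(sorted_a)  # [1, 2, 3, 4, 5]
print(t_n)       # n*(n-1)/2 = 10 comparisons for n = 5
```

The count 10 = 5·4/2 is exactly the quadratic statement frequency the next subsection analyzes.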

(2) Time complexity. In the time frequency just defined, n is called the problem size. As n varies, the time frequency T(n) varies too, but sometimes we want to know the law governing that change. For this we introduce the concept of time complexity. In general, the number of times the basic operation of an algorithm is repeated is a function of the problem size n, written T(n). If there exists an auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is a function of the same order as T(n). This is written T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm.
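The limit in this definition can be checked numerically. A small sketch (my own example values) for T(n) = n^2 + 3n + 4 with f(n) = n^2, where the ratio tends to the nonzero constant 1, so T(n) = O(n^2):

```python
def T(n):
    # T(n) = n^2 + 3n + 4, a statement frequency from the examples below
    return n * n + 3 * n + 4

# As n grows, T(n) / f(n) with f(n) = n^2 approaches 1.
for n in (10, 1000, 100000):
    print(n, T(n) / (n * n))
```

For n = 10 the ratio is 1.34; by n = 100000 it is already within 0.01% of 1.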

If the number of statement executions in an algorithm is a constant, its time complexity is O(1). Conversely, algorithms with different time frequencies may have the same time complexity: for example, T(n) = n^2 + 3n + 4 and T(n) = 4n^2 + 2n + 1 are different frequencies, but both have time complexity O(n^2). In ascending order of magnitude, the common time complexities are: constant order O(1), logarithmic order O(log2 n), linear order O(n), linearithmic order O(n log2 n), quadratic order O(n^2), cubic order O(n^3), ..., k-th power order O(n^k), and exponential order O(2^n). As the problem size n increases, a higher time complexity means lower execution efficiency.

Space complexity is analogous to time complexity: it is a measure of the storage space an algorithm requires while executing on a computer, written S(n) = O(f(n)). We generally discuss only the size of the auxiliary storage, beyond the normal memory the input itself occupies; the discussion proceeds in the same way as for time complexity.
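To see why the ascending order above matters, this small sketch (values chosen by me) tabulates the common growth functions for a few sizes of n:

```python
import math

def growth_table(ns):
    """For each n, return (n, log2 n, n, n*log2 n, n^2, 2^n)."""
    return [(n, math.log2(n), n, n * math.log2(n), n ** 2, 2 ** n)
            for n in ns]

for n, lg, lin, nlg, sq, ex in growth_table([4, 8, 16]):
    print(f"n={n:>2}  log2n={lg:4.1f}  nlog2n={nlg:6.1f}  n^2={sq:>3}  2^n={ex:>5}")
```

Even at n = 16, O(2^n) has reached 65536 while O(log2 n) is still 4 — the gap only widens from there.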

(3) Evaluating by asymptotic time complexity. The time performance of an algorithm is evaluated mainly by the order of magnitude of its time complexity, that is, by its asymptotic time complexity.

(4) Space complexity is a measure of the storage space an algorithm temporarily occupies while running. The storage an algorithm occupies in computer memory includes the space occupied by the algorithm itself, the space occupied by its input and output data, and the space it temporarily occupies while running. The space occupied by the input and output data is determined by the problem being solved and is passed in through the function's parameter list; it does not vary with the algorithm. The space occupied by the algorithm itself is proportional to the length of its code; to compress this space, write a shorter algorithm. The space temporarily occupied during execution, however, varies from algorithm to algorithm. Some algorithms need only a few temporary working units, a number that does not change with the size of the problem; such algorithms are called "in-place" and are economical with storage. For other algorithms the number of temporary working units depends on the problem size n and grows with it; when n is large they occupy many storage units — the quick sort and merge sort algorithms described in Chapter 9, for example, fall into this category.
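The in-place versus auxiliary-space distinction can be sketched in Python (my own example functions, not from the text): reversal swaps within the input array and needs O(1) extra space, while the merge step of merge sort must allocate an O(n) output list:

```python
def reverse_in_place(a):
    """In-place: only two index variables, O(1) auxiliary space."""
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]
        i, j = i + 1, j - 1
    return a

def merge_sorted(left, right):
    """Merge step of merge sort: allocates an O(n) auxiliary list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(reverse_in_place([1, 2, 3, 4]))      # [4, 3, 2, 1]
print(merge_sorted([1, 3, 5], [2, 4, 6]))  # [1, 2, 3, 4, 5, 6]
```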

(5) If the space complexity of an algorithm is a constant — that is, it does not change with the size n of the data being processed — it is written O(1); if it is proportional to the base-2 logarithm of n, it is written O(log2 n); if it is linearly proportional to n, it is written O(n). If a formal parameter is an array, only enough space need be allocated to store one address pointer passed from the actual argument, i.e. one machine word. If a formal parameter is passed by reference, likewise only one address-sized space need be allocated for it, to store the address of the corresponding actual variable, through which the system automatically accesses that variable.
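Python exhibits the same reference-passing behavior described above: passing a list to a function passes only a reference, not a copy, so the parameter costs O(1) space no matter how long the list is, and mutations are visible to the caller. A minimal sketch (the function is hypothetical, my own):

```python
def scale_in_place(a, k):
    """Multiply every element of a by k.

    Only a reference to the caller's list is passed, not a copy, so the
    parameter itself costs O(1) space regardless of len(a); the caller
    sees the mutation.
    """
    for i in range(len(a)):
        a[i] *= k

nums = [1, 2, 3]
scale_in_place(nums, 10)
print(nums)  # [10, 20, 30] — the caller's list was modified
```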

(6) The time and space complexities of common algorithms are summarized below:
