The interrelationships among data elements are called the logical structure of the data. Logical structures are generally divided into four basic types:
In a set structure, the data elements have no relationship other than belonging to the same set.
In a linear structure, there is a one-to-one relationship between data elements.
In a tree structure, there is a one-to-many relationship between data elements.
In a graph (or mesh) structure, there are many-to-many relationships between data elements.
A data structure can be stored in the computer in two different ways:
Sequential storage structure: the logical relationships between data elements are represented by the relative positions of the elements in memory.
Chained storage structure: a pointer is added to each data element to represent its logical relationship to other elements.
Time Complexity
The time an algorithm takes is proportional to the number of times its statements are executed: the more statement executions, the more time it takes.
The number of statement executions in an algorithm is called the statement frequency or time frequency, denoted T(n).
In the time frequency just defined, n is called the size of the problem; as n changes, the time frequency T(n) changes with it.
But sometimes we want to know what pattern T(n) follows as n changes.
For this, we introduce the concept of time complexity.
In general, the number of times the basic operations of an algorithm are repeated is a function of the problem size n, denoted T(n). If there is an auxiliary function f(n) such that the limit of T(n)/f(n) as n approaches infinity is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n), written T(n) = O(f(n)). O(f(n)) is called the asymptotic time complexity of the algorithm, or simply its time complexity.
Sometimes the number of times the basic operations run varies with the input data of the problem. In bubble sort, for example, the amount of work differs depending on whether the input data is already ordered or not. In such cases, we calculate the average.
Common time complexities are related as follows:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!) < O(n^n)
Example 1
sum = 0;                        // (1)
for (i = 1; i <= n; i++)        // (2)
    for (j = 1; j <= n; j++)    // (3)
        sum++;                  // (4)
Statement (1) runs 1 time.
Statement (2) runs n times.
Statement (3) runs n^2 times.
Statement (4) runs n^2 times.
T(n) = 1 + n + 2n^2 = O(n^2)
Example 2
a = 0; b = 1;                   // (1)
for (i = 1; i <= n; i++) {      // (2)
    s = a + b;                  // (3)
    b = a;                      // (4)
    a = s;                      // (5)
}
Statement (1) runs 1 time.
Statement (2) runs n times.
Statements (3), (4), and (5) each run n times.
T(n) = 1 + 4n = O(n)
Example 3
i = 1;              // (1)
while (i <= n)
    i = i * 2;      // (2)
The frequency of statement (1) is 1.
Let the frequency of statement (2) be f(n). Then 2^f(n) <= n, so f(n) <= log2 n.
Taking the maximum, f(n) = log2 n, so
T(n) = O(log2 n)
Space Complexity
Space complexity is a measure of the storage space required by an algorithm, recorded as:
S(n) = O(f(n))
where n is the size of the problem.
The storage space an algorithm occupies in computer memory consists of three parts: the space occupied by the algorithm itself (its code), the space occupied by the algorithm's input and output data, and the extra space used while the algorithm runs. If the extra space is constant relative to the amount of input data, the algorithm is said to work in place.
The space occupied by the algorithm's input and output data is determined by the problem being solved and is passed in through parameters by the calling function; it does not vary with the algorithm. The space occupied by the algorithm itself is proportional to the length of its code; to compress this part of the storage, one must write a shorter algorithm.