Search and Sorting 01: Linear Search, Time Complexity, Algorithms


Linear search is a sequential lookup of an element in a collection or array. This article covers:

 

  • Understanding linear search through code
  • Time Complexity
  • What is an algorithm?

 

Understanding linear search through code

What does "linear" mean here? Let's find out in code.

 

First, we need a collection or array. How do we get one? Generate a random array of fixed length. Then enter a search key: if the key is found, the index of the element is returned; if not, -1 is returned. Simple.

    class Program
    {
        private static int[] arr;
        private static Random r = new Random();

        static void Main(string[] args)
        {
            SeedData(10);
            ShowArray();
            Console.WriteLine("\n");
            Console.WriteLine("Enter the integer to search for:");
            var temp = Convert.ToInt32(Console.ReadLine());
            Console.WriteLine("The location of {0} is {1}", temp, LinearSearch(temp));
            Console.ReadKey();
        }

        // Linear search
        private static int LinearSearch(int key)
        {
            for (int i = 0; i < arr.Length; i++)
            {
                if (arr[i] == key) return i; // if found, return the index
            }
            return -1;                       // if not found, return -1
        }

        // Seed the array with random data
        private static void SeedData(int size)
        {
            arr = new int[size];
            for (int i = 0; i < size; i++)
            {
                arr[i] = r.Next(1, 100);
            }
        }

        // Print all elements of the array
        private static void ShowArray()
        {
            foreach (var item in arr)
            {
                Console.Write(item + " ");
            }
        }
    }

From the code above, we can define linear search: given a search key, each element in the collection is examined in order (in practice, by loop traversal). If the key is found, the index of the element is returned; if not, a sentinel value such as -1 is returned.

 

Time Complexity

Linear search is only the simplest search algorithm; there are other, more complex ones. How can we measure the efficiency of different algorithms? The answer is "time complexity". Any concept comes from practice rather than being created out of thin air, and "time complexity" is no exception.

 

O(1)

Suppose an array contains 10 elements and you need to check whether the first element equals the second. The algorithm runs once and the result is obtained. What if the array contains 100 elements? The result is still obtained after one run. So people concluded that the running time of such an algorithm has nothing to do with the size of the array and called it "constant running time". But that phrase is wordy; what notation should represent it? People settled on O(1).

 

Although O(1) is intuitive, it is easy to misread. Some people think O(1) means the algorithm runs exactly once; that is taking the notation too literally. A more accurate reading of O(1) is that the number of times the algorithm runs is a constant that does not change with the array size.
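To make this concrete, here is a minimal C++ sketch (the function name firstTwoEqual is ours, not from the article): whether the array holds 10 or 10 million elements, the comparison executes exactly once.

```cpp
#include <vector>

// Compare the first two elements of an array: one comparison,
// regardless of how many elements the array holds. O(1).
bool firstTwoEqual(const std::vector<int>& a)
{
    return a[0] == a[1];   // a single operation, independent of a.size()
}
```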

 

O(n)

The highlights of life come from diversity. Suppose an array contains 10 elements and the algorithm must check whether the first element equals any other element in the array; it needs to run 9 times. If the array contains 100 elements, it needs to run 99 times — that is, n-1 times. The number of runs varies with n. This is written as O(n) (the constant 1 is negligible) and read "order n".
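A sketch of the n-1-comparison algorithm just described (countEqualToFirst is a name we chose for illustration): the loop body runs once for each of the other n-1 elements, so the running time grows linearly with n.

```cpp
#include <vector>
#include <cstddef>

// Count how many of the other elements equal the first one.
// The loop runs n-1 times: O(n).
std::size_t countEqualToFirst(const std::vector<int>& a)
{
    std::size_t count = 0;
    for (std::size_t i = 1; i < a.size(); ++i)
        if (a[i] == a[0]) ++count;
    return count;
}
```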

 

O(n²)

Suppose another algorithm must check whether any element in the array equals any other element. The first element must be compared with n-1 elements, the second with the n-2 elements after it, and so on: (n-1) + (n-2) + ... + 2 + 1. With pen and paper this sum works out to n²/2 - n/2. As n grows, the constant factor and the lower-order term can be ignored, so the algorithm is written O(n²), called "quadratic running time" and read "order n squared".
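The counting argument can be checked in code. The sketch below (pairwiseComparisons is our name) counts the comparisons the nested loops perform; for n = 10 the total is exactly (n-1)+(n-2)+...+1 = 45 = n(n-1)/2.

```cpp
#include <vector>
#include <cstddef>

// Compare every element with every later element, counting the comparisons.
// The total is (n-1)+(n-2)+...+1 = n(n-1)/2, i.e. O(n²).
std::size_t pairwiseComparisons(const std::vector<int>& a)
{
    std::size_t comparisons = 0;
    for (std::size_t i = 0; i + 1 < a.size(); ++i)
        for (std::size_t j = i + 1; j < a.size(); ++j)
            ++comparisons;   // one comparison of a[i] with a[j]
    return comparisons;
}
```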

 

When n is in the tens of thousands, an O(n²) algorithm has no noticeable impact on a modern personal computer that performs billions of operations per second. But if n grows to several million, the algorithm needs trillions of operations and may take hours to run; if n reaches billions, the computation could take decades. Every algorithm therefore has its range of applicability.

 

Now we can properly define "time complexity": it is a function that quantitatively describes an algorithm's running time as the input size grows. By convention, time complexity describes the worst case.

Common time complexities:
● O(1), constant, e.g. hash table lookup
● O(log₂ n), logarithmic, e.g. binary search
● O(n), linear
● O(n log₂ n), linearithmic, e.g. the average complexity of quick sort
● O(n²), quadratic, e.g. bubble sort
● O(n³), cubic, e.g. the Floyd algorithm for shortest paths
● O(n^k), k-th power
● O(2^n), exponential, e.g. the Tower of Hanoi

 

What is an algorithm?

An algorithm describes the steps for solving a specific problem: a finite sequence of instructions in a computer, where each instruction represents one or more operations.

sum   = 1   + 2   + 3   + ... + 100
sum   = 100 + 99  + 98  + ... + 1
2*sum = 101 * 100 = 10100
sum   = 5050


The above is an algorithm for summing the integers from 1 to 100. Yes, in the 18th century, in a small German village, a child named Gauss came up with this trick!
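Gauss's pairing trick generalizes to sum(1..n) = n(n+1)/2, an O(1) computation. A one-line sketch (gaussSum is our name for it):

```cpp
// Sum of the integers 1..n via Gauss's pairing: n*(n+1)/2, computed in O(1).
long long gaussSum(long long n)
{
    return n * (n + 1) / 2;
}
```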


Time complexity of various searching and sorting algorithms

2.1 Bubble Sort

Bubble sort is stable; its time complexity is O(n²).
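A sketch of bubble sort (bubbleSort is our illustrative name): each pass floats the largest remaining element to the end, and swapping only strictly greater neighbours is what preserves stability.

```cpp
#include <vector>
#include <utility>
#include <cstddef>

// Bubble sort: after pass k, the last k elements are in their final
// positions. Equal elements are never swapped, so the sort is stable.
void bubbleSort(std::vector<int>& a)
{
    for (std::size_t pass = 0; pass + 1 < a.size(); ++pass)
        for (std::size_t i = 0; i + 1 < a.size() - pass; ++i)
            if (a[i] > a[i + 1])
                std::swap(a[i], a[i + 1]);
}
```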

2.2 Selection Sort

The basic idea of selection sort is to make n-1 passes over the sequence of records. The i-th pass swaps the smallest element of L[i..n] with L[i], so that after i passes the first i records are in their correct positions.

Selection sort is unstable; its time complexity is O(n²).
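A sketch of the passes just described (selectionSort is our name): the long-distance swap that moves the minimum into place is what can reorder equal elements and makes the sort unstable.

```cpp
#include <vector>
#include <utility>
#include <cstddef>

// Selection sort: the i-th pass finds the smallest element of a[i..n-1]
// and swaps it into position i.
void selectionSort(std::vector<int>& a)
{
    for (std::size_t i = 0; i + 1 < a.size(); ++i)
    {
        std::size_t min = i;
        for (std::size_t j = i + 1; j < a.size(); ++j)
            if (a[j] < a[min]) min = j;
        std::swap(a[i], a[min]);   // long-range swap: source of instability
    }
}
```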

2.3 Insertion Sort

The basic idea of insertion sort is that after the (i-1)-th pass, L[1..i-1] is already in order; the i-th pass merely inserts L[i] into the proper position of L[1..i-1], so that L[1..i] becomes a sorted sequence. This can be done by sequential comparison: first compare L[i] with L[i-1]; if L[i-1] ≤ L[i], then L[1..i] is already sorted and the i-th pass ends. Otherwise, swap L[i] with L[i-1] and continue by comparing L[i-1] with L[i-2], until a position j (1 ≤ j ≤ i-1) is found such that L[j] ≤ L[j+1]. Figure 1 demonstrates insertion sort on four elements, which requires three inserts: (a), (b), and (c).

Direct insertion sort is stable; its time complexity is O(n²).
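A sketch of the insertion step (insertionSort is our name): instead of repeated swaps, larger elements are shifted one slot right and the key is dropped into the gap, which is equivalent and slightly cheaper.

```cpp
#include <vector>
#include <cstddef>

// Insertion sort: a[0..i-1] is already sorted; the i-th pass shifts
// larger elements right and places a[i] in its proper position.
void insertionSort(std::vector<int>& a)
{
    for (std::size_t i = 1; i < a.size(); ++i)
    {
        int key = a[i];
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key)
        {
            a[j] = a[j - 1];   // shift right
            --j;
        }
        a[j] = key;
    }
}
```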

2.4 Heap Sort

Heap sort is a tree-based sort. During sorting, A[n] is treated as the sequential storage of a complete binary tree, and the relationship between parent and child nodes in that tree is used to select the smallest element.

Heap sort is unstable; its time complexity is O(n log n).
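A sketch of heap sort (here using a max-heap that moves the largest element to the end each round; siftDown and heapSort are our names). In the sequential storage of a complete binary tree, the children of node i sit at 2i+1 and 2i+2.

```cpp
#include <vector>
#include <utility>
#include <cstddef>

// Restore the max-heap property for the subtree rooted at `root`,
// considering only elements in a[0..end-1].
void siftDown(std::vector<int>& a, std::size_t root, std::size_t end)
{
    while (2 * root + 1 < end)
    {
        std::size_t child = 2 * root + 1;
        if (child + 1 < end && a[child + 1] > a[child]) ++child;
        if (a[root] >= a[child]) return;
        std::swap(a[root], a[child]);
        root = child;
    }
}

// Heap sort: build a max-heap, then repeatedly swap the root (maximum)
// to the end of the unsorted region and sift the new root down.
void heapSort(std::vector<int>& a)
{
    std::size_t n = a.size();
    for (std::size_t i = n / 2; i-- > 0; )   // build the heap bottom-up
        siftDown(a, i, n);
    for (std::size_t end = n; end > 1; --end)
    {
        std::swap(a[0], a[end - 1]);          // max to its final place
        siftDown(a, 0, end - 1);
    }
}
```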

2.5 Merge Sort

Suppose two ordered (ascending) runs are stored in adjacent positions of the same array, say A[l..m] and A[m+1..h]. Merge them into one ordered run stored in A[l..h].

The time complexity is O(n log₂ n) in both the best and worst cases.
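The merge step described above can be sketched as follows (mergeRuns is our name; taking from the left run on ties, a[i] ≤ a[j], is what keeps merge sort stable):

```cpp
#include <vector>
#include <cstddef>

// Merge the two sorted runs a[l..m] and a[m+1..h] (inclusive bounds)
// into one sorted run stored back in a[l..h], via a temporary buffer.
void mergeRuns(std::vector<int>& a, std::size_t l, std::size_t m, std::size_t h)
{
    std::vector<int> tmp;
    tmp.reserve(h - l + 1);
    std::size_t i = l, j = m + 1;
    while (i <= m && j <= h)
        tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);  // <= keeps it stable
    while (i <= m) tmp.push_back(a[i++]);
    while (j <= h) tmp.push_back(a[j++]);
    for (std::size_t k = 0; k < tmp.size(); ++k)
        a[l + k] = tmp[k];
}
```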

2.6 Quick Sort

Quick sort is an essential improvement on bubble sort. The basic idea is that after one scan, the length of the sequences still to be sorted shrinks greatly. In bubble sort, one scan only guarantees that the largest value moves to its correct position, so the length of the unsorted sequence may shrink by only 1. One quick-sort scan ensures that all numbers to the left of a chosen benchmark (pivot) are smaller than it and all numbers to the right are larger. The same method is then applied to the numbers on both sides of the pivot, until each side contains at most one element.

Quick sort is unstable; its time complexity is O(n log₂ n) in the ideal case and O(n²) in the worst case.
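A sketch of one such scan and the recursion (quickSort is our name; we use the Lomuto partition scheme with the last element as pivot, one of several common choices):

```cpp
#include <vector>
#include <utility>

// Quick sort with Lomuto partition: one scan puts the pivot in its final
// position, smaller values to its left and larger to its right; then the
// two sides are sorted recursively.
void quickSort(std::vector<int>& a, int lo, int hi)
{
    if (lo >= hi) return;
    int pivot = a[hi], p = lo;
    for (int i = lo; i < hi; ++i)
        if (a[i] < pivot) std::swap(a[i], a[p++]);
    std::swap(a[p], a[hi]);        // pivot into its final position
    quickSort(a, lo, p - 1);
    quickSort(a, p + 1, hi);
}
```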

2.7 Shell Sort

In direct insertion sort, inserting one number at a time adds only one element to the sorted sequence and gives no help for inserting the next number. If instead comparisons are made between numbers that are a long distance apart (called the increment), one comparison may move an element across many positions and eliminate several exchanges at once. D. L. Shell implemented this idea in 1959 in the sorting algorithm named after him. The algorithm divides the numbers to be sorted into several groups according to an increment d, so that the subscripts of records within a group differ by d. Each group is sorted, and then the groups are sorted again with a smaller increment. When the increment is reduced to 1, the whole sequence forms a single group and the sort is complete.

Shell sort is unstable, and its time complexity is O(n²).
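A sketch of Shell sort (shellSort is our name; we assume the simple halving gap sequence n/2, n/4, ..., 1 — the text leaves the increment sequence open): each pass is an insertion sort over elements `gap` apart, and the final pass with gap = 1 is an ordinary insertion sort on an almost-sorted array.

```cpp
#include <vector>
#include <cstddef>

// Shell sort with halving gaps: gapped insertion sort, shrinking the gap
// until a final ordinary insertion-sort pass with gap == 1.
void shellSort(std::vector<int>& a)
{
    for (std::size_t gap = a.size() / 2; gap > 0; gap /= 2)
        for (std::size_t i = gap; i < a.size(); ++i)
        {
            int key = a[i];
            std::size_t j = i;
            while (j >= gap && a[j - gap] > key)
            {
                a[j] = a[j - gap];   // shift within the gap-group
                j -= gap;
            }
            a[j] = key;
        }
}
```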

No.  Sorting type    Time complexity  Space complexity  Stable
1    Insertion sort  O(n²)            O(1)              √
2    Shell sort      O(n²)            O(1)              ×
3    Bubble sort     O(n²)            O(1)              √
4    Selection sort  O(n²)            O(1)              ×
5    Quick sort      O(n log n)       O(log n)          ×
6    Heap sort       O(n log n)       O(1)              ×
7    Merge sort      O(n log n)       O(n)              √

Data structures: half-interval search, selection sort, and the bubble method

The half-interval method, also called binary search, makes full use of the order relation between elements and adopts a divide-and-conquer strategy; in the worst case it completes the search in O(log n). The basic idea is to divide the n elements into two halves of roughly equal size and compare a[n/2] with the key x (assuming the array elements are in ascending order). If x = a[n/2], x is found and the algorithm terminates. If x < a[n/2], only the left half of array a needs to be searched; if x > a[n/2], only the right half.

Binary search is widely used and easy to understand, but writing a correct binary search algorithm is not easy. The first binary search algorithm appeared as early as 1946, yet the first fully correct one did not appear until 1962. In "Writing Correct Programs", Bentley wrote that 90% of computer professionals cannot write a completely correct binary search within two hours. The key is to define the boundaries of each search range precisely, determine the termination condition, and correctly handle the odd and even cases. Once that is sorted out, the algorithm itself is very intuitive. In C++ it can be described as follows:

    template <class Type>
    int BinarySearch(Type a[], const Type& x, int n)
    {
        int left = 0;
        int right = n - 1;
        while (left <= right) {
            int middle = (left + right) / 2;
            if (x == a[middle]) return middle;
            if (x > a[middle]) left = middle + 1;
            else right = middle - 1;
        }
        return -1;   // not found
    }

The template function BinarySearch searches for x among n elements sorted in ascending order, a[0] ≤ a[1] ≤ ... ≤ a[n-1]. If x is found, its position in the array is returned; otherwise -1 is returned. It is easy to see that each execution of the while loop halves the size of the range still to be searched, so the worst-case time complexity of the whole algorithm is O(log n). When the data volume is large, its advantage over linear search is clear in terms of time complexity.
Selection sort
The basic idea: on the i-th pass, select the i-th smallest record and place it at position i (with i starting from 0 — by this phrasing the "0th smallest" record is simply the smallest, which reads a bit awkwardly, but no matter). The sort is finished when i = n-1.

Direct selection sort
Direct selection sort simply realizes the basic idea of selection sort. Finding the smallest element costs O(n) each time; if no special processing is done and the simplest search is used on every pass, the time complexity of the whole sort is naturally O(n²).

Bubble method
To place the maximum value in a[1], compare a[1] in turn with a[2], a[3], ..., a[10]. First compare a[1] with a[2]: if a[1] < a[2], exchange them; otherwise do not. Now a[1] holds the larger of a[1] and a[2]. Then compare a[1] with a[3]: if a[1] < a[3], exchange them; otherwise do not. Now a[1] holds the maximum of a[1], a[2], a[3], and so on. Finally compare a[1] with a[10]: if a[1] < a[10], exchange them. The value now in a[1] is the maximum of array a (9 comparisons in total).

To obtain the second largest value in a[2], compare a[2] in turn with a[3], a[4], ..., a[10]. After 8 comparisons, a[2] holds the second largest value.

