1. Introduction to red-black trees
What is a red-black tree? Why do you need a red-black tree?
Searching, inserting, deleting, finding the maximum node, finding the minimum node, and finding the predecessor/successor node are common requirements. What data structure implements all of these basic operations efficiently? From the earlier basic introduction to trees in this blog, you probably already have a sense of the characteristics of the various kinds of trees. Although an AVL tree guarantees O(log n) time for all of these basic operations, its rotation operations are complicated, and in practice the red-black tree is used more often.
A red-black tree is a special kind of binary search tree. It guarantees that, even in the worst case, all of the basic operations listed above run in O(log n) time.
So how does it achieve this? The properties of the red-black tree listed below are what guarantee the logarithmic time complexity.
① Each node is either red or black.
② The root node is black.
③ Each leaf node (NIL) is black. Note: the leaf nodes here are different from the leaf nodes usually discussed. Here a leaf node is a NIL node (its data field is nil), whereas a leaf node in the usual sense is a node whose left and right children are both NIL.
④ If a node is red, then both of its children are black.
⑤ For each node, all paths from that node down to its descendant leaf nodes contain the same number of black nodes.
Together, these five properties ensure that the height of a red-black tree is O(log n), and therefore that all of the basic operations take O(log n) time.
2. Why a red-black tree guarantees O(log n) time for the basic operations
We now prove why the height of a red-black tree is O(log n).
To prove: the height of a red-black tree with n internal nodes is at most 2·log₂(n+1).
Leaf node: a node whose data field is NIL. Internal node (non-leaf node): a node whose data field is not NIL. Each node of a red-black tree has five fields: ① color, ② data field (key), ③ left child pointer, ④ right child pointer, ⑤ parent pointer.
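As a concrete illustration of these five fields, a node could be declared as follows. This is only a minimal sketch in Java; the class name RBNode and its members are illustrative choices, not taken from the original post, and integer keys are assumed. In this sketch a null child plays the role of a black NIL leaf (CLRS uses an explicit sentinel node instead).

```java
// Minimal red-black tree node sketch (illustrative names, integer keys assumed).
class RBNode {
    static final boolean RED = true;
    static final boolean BLACK = false;

    boolean color;   // ① color: RED or BLACK
    int key;         // ② data field (key)
    RBNode left;     // ③ left child pointer
    RBNode right;    // ④ right child pointer
    RBNode parent;   // ⑤ parent pointer

    RBNode(int key) {
        this.key = key;
        this.color = RED; // newly inserted nodes start out red (see section 3)
    }
}
```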
First, define the notion of black height: for a node x, the number of black nodes on any path from x (not counting x itself) down to a leaf node is called the black height of x, written bh(x).
First, by mathematical induction one can show that the subtree rooted at any node x contains at least 2^bh(x) − 1 internal nodes.
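As a side note, here is a small sketch of how a black-node count can be computed on the RBNode structure above while simultaneously checking properties ④ and ⑤. The method name is an illustrative choice, null is treated as a black NIL leaf, and (as the comment notes) conventions differ on whether x itself is counted in bh(x).

```java
// Sketch: return the number of black nodes on a path from x down to a NIL leaf
// (counting x itself if it is black, and counting the NIL leaf), or -1 if
// property ④ or ⑤ is violated somewhere in the subtree rooted at x.
// Note: conventions differ on whether x itself is counted in bh(x).
static int checkBlackHeight(RBNode x) {
    if (x == null) {
        return 1;                              // NIL leaf: black (property ③)
    }
    int lh = checkBlackHeight(x.left);
    int rh = checkBlackHeight(x.right);
    if (lh == -1 || rh == -1 || lh != rh) {
        return -1;                             // property ⑤ violated
    }
    if (x.color == RBNode.RED) {
        if ((x.left != null && x.left.color == RBNode.RED)
                || (x.right != null && x.right.color == RBNode.RED)) {
            return -1;                         // property ④ violated: red node with red child
        }
        return lh;                             // a red node adds nothing to the black count
    }
    return lh + 1;                             // a black node adds one
}
```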
Second, suppose the height of the red-black tree is h (with the root at level 0). By property ④, on any simple path from the root down to a leaf, at least half of the nodes (not counting the root itself) are black, so the black height of the root is at least h/2.
Hence n ≥ 2^(h/2) − 1, from which h ≤ 2·log₂(n+1) follows, proving that the height of a red-black tree is O(log n).
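Spelled out, the final step of the argument is just the algebra below, combining the induction lemma with the bound on the root's black height:

```latex
n \;\ge\; 2^{\,\mathrm{bh}(\mathrm{root})} - 1 \;\ge\; 2^{h/2} - 1
\;\Longrightarrow\;
n + 1 \;\ge\; 2^{h/2}
\;\Longrightarrow\;
\frac{h}{2} \;\le\; \log_2(n+1)
\;\Longrightarrow\;
h \;\le\; 2\log_2(n+1).
```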
(Figure: a red-black tree with n = 3 and h = 1; the black height of the root is 1.)
(Figure: a red-black tree with n = 15 and h = 3.)
3. Basic operations on a red-black tree
① Insert operation (the detailed implementation is not discussed here).
Insertion is divided into three main steps:
Step one: as when inserting into an ordinary binary search tree, the node to be inserted is compared recursively with the nodes already in the tree; if its key is larger, it descends into the right subtree, and if it is smaller, it descends into the left subtree.
Step two: after step one, the node to be inserted now sits in the bottom layer of the tree; mark it red.
Step three: adjust the tree around the inserted node to ensure that the red-black properties are not violated. The adjustment mainly consists of various rotations (together with recoloring); a rotation only changes where a few pointers point, so its time complexity is O(1). A code sketch of these steps follows below.
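The following is a minimal sketch, in Java, of steps one and two plus a left rotation as used in step three. It assumes the RBNode class sketched in section 2; the class and method names (RBTreeSketch, insertFixup, leftRotate) are illustrative, and the actual fix-up logic (the combination of recoloring and rotations described in Introduction to Algorithms) is only stubbed out.

```java
// Sketch of insertion steps 1-2 and of a left rotation (a step-3 building block).
// Assumes the RBNode class sketched earlier; null stands for a black NIL leaf.
class RBTreeSketch {
    RBNode root;

    void insert(int key) {
        RBNode z = new RBNode(key);            // step 2: the new node starts out red
        RBNode parent = null, cur = root;
        while (cur != null) {                  // step 1: ordinary BST descent
            parent = cur;
            cur = (key < cur.key) ? cur.left : cur.right;
        }
        z.parent = parent;
        if (parent == null)        root = z;   // tree was empty
        else if (key < parent.key) parent.left = z;
        else                       parent.right = z;
        insertFixup(z);                        // step 3: restore the red-black properties
    }

    // Left rotation: only a constant number of pointers are reassigned, hence O(1).
    void leftRotate(RBNode x) {
        RBNode y = x.right;                    // y moves up, x becomes its left child
        x.right = y.left;
        if (y.left != null) y.left.parent = x;
        y.parent = x.parent;
        if (x.parent == null)        root = y;
        else if (x == x.parent.left) x.parent.left = y;
        else                         x.parent.right = y;
        y.left = x;
        x.parent = y;
    }

    void insertFixup(RBNode z) {
        // Omitted in this sketch: recolor and rotate while z's parent is red,
        // then color the root black, as described in Introduction to Algorithms.
    }
}
```

The reason a rotation costs O(1) is visible directly in leftRotate: only a handful of pointers are reassigned, and no other node of the tree is touched.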
② Delete operation
Deleting a node can also violate the red-black properties, so a similar adjustment is needed afterwards. For the details, refer to Introduction to Algorithms.
Red-Black Tree Summary (Part 1)