JDK 8 made a significant change and optimization to HashMap. In the earlier implementation, the map is backed by a bucket array plus a load factor, and each bucket holds a linked list. When the hash distribution is uneven, many keys map onto the list of the same bucket. As the number of elements grows past a critical value, the likelihood of collisions increases, and at that point the map rehashes.
Implementation under JDK 7: there are several key variables:
threshold: the map's capacity * loadFactor; it doubles on each expansion
loadFactor: the fill factor, 0.75f by default
size: the actual number of key-value entries
capacity: the size of the map's Entry array, initialized to 16
A simple walk-through: on initialization, the Entry[] table has size 16 and the load factor is 0.75f. On put, the key's hash value is computed first, then mapped to a bucket index with i = hash & (table.length - 1). With the index in hand, the linked list in that bucket is traversed; if an entry with the same key exists, its value is replaced and the old value is returned. If no matching entry exists, a new Entry is created, but first a check is made:
public V put(K key, V value) {
    if (key == null)
        return putForNullKey(value);
    int hash = hash(key);
    int i = indexFor(hash, table.length);
    for (Entry<K,V> e = table[i]; e != null; e = e.next) {
        Object k;
        if (e.hash == hash && ((k = e.key) == key || key.equals(k))) {
            V oldValue = e.value;
            e.value = value;
            e.recordAccess(this);
            return oldValue;
        }
    }
    modCount++;
    addEntry(hash, key, value, i);
    return null;
}
void addEntry(int hash, K key, V value, int bucketIndex) {
    if ((size >= threshold) && (null != table[bucketIndex])) {
        resize(2 * table.length);
        hash = (null != key) ? hash(key) : 0;
        bucketIndex = indexFor(hash, table.length);
    }
    createEntry(hash, key, value, bucketIndex);
}
addEntry first checks whether the total number of elements has reached the critical value (capacity * loadFactor) and whether the target bucket is non-empty. If so, the table's capacity is doubled (doubling minimizes how many entries need to move) and every entry is rehashed into the new table using head insertion. The new element is then inserted at the head of the list at bucket index i, which avoids traversing the list again. Resizing in this way keeps the chains on the hash buckets from growing too long, which would otherwise happen in the extreme case where many collisions map to the same bucket.
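The bucket mapping i = hash & (table.length - 1) used above works only because HashMap keeps its table length a power of two: masking with length - 1 is then equivalent to a (non-negative) modulo, but cheaper. A minimal sketch, not JDK code, verifying the equivalence:

```java
public class IndexForDemo {
    // Same masking trick as JDK 7's indexFor(): valid only when length
    // is a power of two, where (hash & (length - 1)) == hash % length
    // for non-negative hashes.
    static int indexFor(int hash, int length) {
        return hash & (length - 1);
    }

    public static void main(String[] args) {
        for (int length : new int[]{16, 32, 64}) {
            for (int hash : new int[]{0, 1, 15, 16, 17, 12345}) {
                if (indexFor(hash, length) != hash % length)
                    throw new AssertionError("mask != modulo");
            }
        }
        System.out.println("hash & (length - 1) == hash % length for power-of-two lengths");
    }
}
```

This is also why HashMap rounds any requested capacity up to the next power of two.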
The picture above is a simplified illustration of the JDK 7 HashMap. It is only schematic, not exact: the number of hash buckets and the positions after rehashing are not computed precisely.
Now let's look at JDK 8's improvements to HashMap. Several of the important factors remain the same, but the structure was improved. In short, a TreeNode node type was added, and a bucket's linked list is converted to a red-black tree once it grows to a certain length (in the extreme case, this optimization brings the complexity down to O(log n)).
The new properties are:
/**
 * The bin count threshold for using a tree rather than list for a
 * bin.  Bins are converted to trees when adding an element to a
 * bin with at least this many nodes.  The value must be greater
 * than 2 and should be at least 8 to mesh with assumptions in
 * tree removal about conversion back to plain bins upon
 * shrinkage.
 */
static final int TREEIFY_THRESHOLD = 8;

/**
 * The bin count threshold for untreeifying a (split) bin during a
 * resize operation.  Should be less than TREEIFY_THRESHOLD, and at
 * most 6 to mesh with shrinkage detection under removal.
 */
static final int UNTREEIFY_THRESHOLD = 6;
As the comments explain, these two thresholds determine when a bucket's list is converted into a red-black tree, and when a (split) tree is converted back into a plain list.
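The threshold is about performance, not correctness: a HashMap behaves identically whether a bin is a list or a tree. A small sketch with a hypothetical CollidingKey type (constant hashCode, so every instance lands in the same bucket) shows lookups still work well past TREEIFY_THRESHOLD. Note that on JDK 8 the bin is only actually treeified once the table capacity also reaches MIN_TREEIFY_CAPACITY (64); smaller tables are resized instead.

```java
import java.util.HashMap;
import java.util.Map;

public class TreeifyDemo {
    // Hypothetical key type: constant hashCode forces every instance into
    // the same bucket. Implementing Comparable lets a tree bin order the
    // entries instead of falling back to identity-hash tie-breaking.
    static final class CollidingKey implements Comparable<CollidingKey> {
        final int id;
        CollidingKey(int id) { this.id = id; }
        @Override public int hashCode() { return 42; }  // deliberate collision
        @Override public boolean equals(Object o) {
            return o instanceof CollidingKey && ((CollidingKey) o).id == id;
        }
        @Override public int compareTo(CollidingKey o) {
            return Integer.compare(id, o.id);
        }
    }

    // Inserts n colliding keys, then looks one up; the result is the same
    // whether the bin is still a list or has been converted to a tree.
    static int lookupAfterFill(int n, int probe) {
        Map<CollidingKey, Integer> map = new HashMap<>();
        for (int i = 0; i < n; i++)
            map.put(new CollidingKey(i), i);
        return map.get(new CollidingKey(probe));
    }

    public static void main(String[] args) {
        System.out.println(lookupAfterFill(20, 7)); // prints: 7
    }
}
```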
Let's look at the constructor first.
public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}
This constructor only initializes the 0.75f load factor; the rest of the initialization is deferred until the first put.
public V put(K key, V value) {
    return putVal(hash(key), key, value, false, true);
}

static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
               boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            for (int binCount = 0; ; ++binCount) {
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
final Node<K,V>[] resize() {
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        if (oldCap >= MAXIMUM_CAPACITY) {
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY)
            newThr = oldThr << 1; // double threshold
    }
    else if (oldThr > 0) // initial capacity was placed in threshold
        newCap = oldThr;
    else {               // zero initial threshold signifies using defaults
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    threshold = newThr;
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    if (oldTab != null) {
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                oldTab[j] = null;
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                else { // preserve order
                    Node<K,V> loHead = null, loTail = null;
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null) loHead = e;
                            else loTail.next = e;
                            loTail = e;
                        }
                        else {
                            if (hiTail == null) hiHead = e;
                            else hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}
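The lo/hi split in resize() works because doubling the capacity adds exactly one bit to the index mask: a node in bucket j stays at j when (hash & oldCap) == 0, and otherwise moves to j + oldCap. A minimal sketch of that rule (not JDK code):

```java
public class ResizeSplitDemo {
    // After doubling from oldCap to 2 * oldCap, the new index keeps one
    // extra hash bit; this reproduces resize()'s lo/hi placement rule.
    static int newIndex(int hash, int oldCap) {
        return hash & (2 * oldCap - 1);
    }

    public static void main(String[] args) {
        int oldCap = 16;
        for (int hash : new int[]{5, 21, 37, 53}) { // all in bucket 5 at cap 16
            int j = hash & (oldCap - 1);
            int expected = ((hash & oldCap) == 0) ? j : j + oldCap;
            if (newIndex(hash, oldCap) != expected)
                throw new AssertionError("split rule violated for hash " + hash);
            System.out.println("hash " + hash + ": " + j + " -> "
                               + newIndex(hash, oldCap));
        }
    }
}
```

This is why no hash needs to be recomputed during a JDK 8 resize, and why the relative order within each half of a bin is preserved.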
On the first put, the Node[] table is null, so resize() initializes it to length 16 and sets the threshold to 16 * 0.75f = 12 (the threshold is recalculated every time the array doubles). Once initialized, the hash value is computed and, as in JDK 7, hash & (table.length - 1) gives the array index. On each put: if slot i of the table is empty, a new node is created and assigned to table[i]. If table[i] holds a node whose hash and key match the one being put, its value is replaced directly. Otherwise, the node type of table[i] is examined: if it is a tree node, putTreeVal is called to insert the node into the tree; if it is a plain linked-list node, the list at table[i] is traversed. When the bin count reaches 8 (TREEIFY_THRESHOLD), the list is converted to a red-black tree and the node inserted; below 8 there is no need to restructure, and the node is simply appended at the tail of the list.
if (++size > threshold)
    resize();
Finally, if the total number of elements exceeds the threshold, resize() doubles the array. Converting long lists to trees improves HashMap's worst-case complexity from O(n) in JDK 7 to O(log n).
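One detail of putVal worth making concrete is its return value, visible in both the JDK 7 and JDK 8 code above: put returns the previous value for the key, or null when there was no mapping. A short usage sketch:

```java
import java.util.HashMap;

public class PutReturnDemo {
    // Returns what put() yields on the second insert of the same key.
    static Integer secondPutReturn() {
        HashMap<String, Integer> map = new HashMap<>();
        Integer first = map.put("a", 1); // no prior mapping: returns null
        if (first != null)
            throw new AssertionError("expected null on first put");
        return map.put("a", 2);          // replaces value 1 and returns it
    }

    public static void main(String[] args) {
        System.out.println(secondPutReturn()); // prints: 1
    }
}
```

Note that a null return is ambiguous when the map permits null values, which is why containsKey exists.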
A diagram of the JDK 8 version of HashMap, found on the internet, is attached here.