In Layman's Java Concurrency (23): Concurrent Containers Part 8, BlockingQueue (3)


The Set hierarchy includes SortedSet, which holds elements arranged in their natural order. Likewise, the Queue hierarchy introduces a queue that supports ordering alongside the plain FIFO model.

The earlier article "Introduction to concurrent queues Queue" mentioned that PriorityQueue and PriorityBlockingQueue are queues that support sorting. A sorted queue that also supports blocking is clearly more complex to implement than a non-thread-safe one, so the discussion below covers only PriorityBlockingQueue; PriorityQueue is essentially the same once the blocking functionality is removed.

Sorted BlockingQueue: PriorityBlockingQueue

First a brief look at PriorityQueue, because PriorityBlockingQueue is implemented internally by wrapping a PriorityQueue and adding a lock for synchronization and blocking.

PriorityQueue is backed by an array but is logically a binary tree (a binary heap) in which every node is smaller than its child nodes, so the root is the smallest element. Each element is either itself comparable (Comparable) or the queue has a comparator (Comparator&lt;? super E&gt;); the order of all elements is determined by comparing them with each other. The root of the heap is element 0 of the array, so the queue always removes element 0. The children of element 0 are elements 1 and 2; the children of element 1 are elements 3 and 4; and so on, the parent of element i is element (i-1)/2. When a new element joins the queue, it is compared against its parent at (i-1)/2: if the new node is smaller than the parent, the two are swapped, and the comparison continues with the new parent, until the heap order is restored and every parent is again smaller than its children. This algorithm has more details than can be covered here; for both removal and lookup it is enough to know that the root (the element at index 0) is always the minimum.
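The sift-up step described above can be sketched as follows. This is a minimal illustration (not the JDK source): a new element starts at the end of the array and swaps with its parent at index (i - 1) / 2 until the heap order holds.

```java
// Minimal sketch of the sift-up step a binary heap performs on insert.
public class SiftUpDemo {
    // Inserts value at index `size` of heap[] and sifts it up toward the root.
    static void siftUp(int[] heap, int size, int value) {
        int i = size;
        heap[i] = value;
        while (i > 0) {
            int parent = (i - 1) / 2;           // parent index
            if (heap[parent] <= heap[i]) break; // heap order already holds
            int tmp = heap[parent];             // swap child with parent
            heap[parent] = heap[i];
            heap[i] = tmp;
            i = parent;                         // continue from the parent slot
        }
    }

    public static void main(String[] args) {
        int[] heap = new int[8];
        int size = 0;
        for (int v : new int[]{5, 3, 8, 1}) {
            siftUp(heap, size++, v);
        }
        // Index 0 is always the smallest element.
        System.out.println(heap[0]); // prints 1
    }
}
```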

It is particularly worth noting that PriorityQueue is an unbounded queue: whenever the number of elements reaches the size of the backing array, the array is grown by 50%, so the array can keep growing indefinitely. Of course, if the size approaches the maximum value of an integer, an OutOfMemoryError results; this limit is enforced by the growth logic.

Because PriorityBlockingQueue is unbounded, it needs only a not-empty signal. That means only take() can block, and put() never blocks (unless the queue grows toward Integer.MAX_VALUE elements and an OutOfMemoryError is thrown).

Only the take() operation can be suspended, and only when the queue is empty. All other operations that modify or inspect the queue simply acquire the exclusive ReentrantLock, which keeps the implementation straightforward. Note that PriorityBlockingQueue uses a fair lock.

In summary, PriorityBlockingQueue is not a FIFO queue but an ordered queue that always removes the smallest element in "natural order". It is a BlockingQueue that can block only on dequeue, never on enqueue, and all of its operations are thread-safe.
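A small sketch of this behavior: elements come out in natural order, not in insertion (FIFO) order, and offer() never blocks because the queue is unbounded.

```java
import java.util.concurrent.PriorityBlockingQueue;

// Demonstrates that PriorityBlockingQueue dequeues in natural order,
// regardless of insertion order.
public class PriorityBlockingQueueDemo {
    static String drain() throws InterruptedException {
        PriorityBlockingQueue<Integer> q = new PriorityBlockingQueue<>();
        q.offer(30); // offer() never blocks: the queue is unbounded
        q.offer(10);
        q.offer(20);
        StringBuilder sb = new StringBuilder();
        while (!q.isEmpty()) {
            sb.append(q.take()).append(' '); // take() returns the smallest element
        }
        return sb.toString().trim();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(drain()); // prints: 10 20 30
    }
}
```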

Direct hand-off BlockingQueue: SynchronousQueue

This is an interesting blocking queue: every insert operation must wait for a remove operation from another thread, and every remove operation must wait for an insert from another thread. So the queue actually holds no elements, or its capacity is 0; strictly speaking it is not a container at all. Because the queue has no capacity, peek() can never return an element (it always returns null): an element exists only at the moment it is removed.

What use is a concurrent queue with no capacity? What justifies its existence?

The implementation of SynchronousQueue is quite complex. Analyzing it in depth would certainly yield some insight, but the earlier deep dives into internals have led increasingly into data structures and algorithms, while my intention is to study the principles of concurrency so as to make the best use of available resources. So the following chapters will avoid data structures and algorithms as much as possible, touching on them only where necessary to understand the underlying principles; I hope that will be enough.

Back to the topic. SynchronousQueue has no internal capacity, but because every insert operation must be paired with a remove operation (and vice versa), no element ever lingers inside a SynchronousQueue: as soon as both an inserting thread and a removing thread are present, the element is transferred directly from the inserter to the remover. In other words, it behaves more like a channel (pipe) through which resources pass quickly from one side to the other.

Note in particular that although elements never "stay" inside a SynchronousQueue, that does not mean it maintains no queue at all. In fact, SynchronousQueue maintains a queue of threads: an inserting thread (or a removing thread) is queued whenever its counterpart does not yet exist. And since there is a queue, there is also fairness and unfairness; fair mode guarantees that waiting inserting or removing threads hand off the resource in FIFO order.

Obviously this is a fast way to pass elements: an element always moves as quickly as possible from the inserter (producer) to the remover (consumer), which for a multi-threaded task queue is the fastest way to hand off work. This property comes up again in the chapters on thread pools.
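The hand-off behavior can be sketched like this: put() blocks until another thread is ready to take(), so the element passes directly from producer to consumer. The `true` constructor argument requests fair (FIFO) ordering of waiting threads.

```java
import java.util.concurrent.SynchronousQueue;

// Demonstrates the rendezvous: put() and take() must meet across threads.
public class HandoffDemo {
    static int handoff() throws InterruptedException {
        SynchronousQueue<Integer> q = new SynchronousQueue<>(true); // fair mode
        Thread producer = new Thread(() -> {
            try {
                q.put(42); // blocks here until the main thread calls take()
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        int value = q.take(); // rendezvous: unblocks the producer
        producer.join();
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(handoff()); // prints 42
    }
}
```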

In fact, there is one more BlockingQueue implementation described in "Introduction to concurrent queues Queue": DelayQueue, a delay queue. Its distinguishing feature is that every element carries a delay (an expiry time), and an element can leave the queue only once its delay has elapsed, so each element fetched from the queue is always the one whose delay expired first. The typical scenario for such a queue is scheduled tasks. In the past, scheduled tasks might be implemented with Timer/TimerTask, which works by cyclic polling: each loop iteration checks all elements to see whether any meets its condition, and runs the corresponding task once one does. Obviously this wastes a great deal of checking work, since most of the time the checks accomplish nothing. DelayQueue avoids this unnecessary polling. This implementation too is discussed in more detail in the scheduled-tasks section of the thread-pool chapters.
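A minimal sketch of a DelayQueue element (the `Task` class and its fields are illustrative, not from the original article): an element becomes available to take() only after its delay expires, so take() blocks instead of polling, and the earliest-due element comes out first.

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

// A hypothetical Delayed element: due `delayMs` milliseconds after creation.
public class DelayQueueDemo {
    static class Task implements Delayed {
        final String name;
        final long runAt; // absolute time (ms) when this task becomes due

        Task(String name, long delayMs) {
            this.name = name;
            this.runAt = System.currentTimeMillis() + delayMs;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(runAt - System.currentTimeMillis(),
                                TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS),
                                other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    static String takeFirst() throws InterruptedException {
        DelayQueue<Task> q = new DelayQueue<>();
        q.put(new Task("later", 200));
        q.put(new Task("sooner", 50));
        return q.take().name; // blocks ~50 ms, then returns the earliest-due task
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(takeFirst()); // prints sooner
    }
}
```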

Below is a summary of how to choose among the common BlockingQueue implementations. It does not cover double-ended queues, and although ConcurrentLinkedQueue is not a blocking queue, it is included for comparison.

  • If you do not need blocking, prefer ConcurrentLinkedQueue.

  • If you need a blocking queue of fixed size, prefer ArrayBlockingQueue; if the size is not fixed, prefer LinkedBlockingQueue.

  • If you need a sorted queue, choose PriorityBlockingQueue.

  • If you need a fast hand-off queue, choose SynchronousQueue.

  • If you need to delay the elements in the queue, choose DelayQueue.
