Concurrent Package Summary: thread-safe collection operations


Java provides a rich set of collection classes, which can be roughly divided into Set (unordered collections of elements), List (ordered collections), and Map (collections of key-value pairs). Queue operations were added in Java 5, and the same release also introduced thread-safe and blocking collection classes in the java.util.concurrent package. This article looks only at the thread-safe collection classes in that package.

First, look at the class diagram of the thread-safe classes under the concurrent package:

1. CopyOnWriteArraySet class

CopyOnWriteArraySet is implemented on top of CopyOnWriteArrayList, so it has the same properties as CopyOnWriteArrayList:

    • It is appropriate when the number of elements is small and read operations greatly outnumber write operations (see the sketch after this list)
    • It is thread-safe
    • Write operations (add(), remove(), and so on) are expensive and inefficient, because every write copies the underlying array
    • The iterator does not support the mutating remove() operation
    • An iterator is built on a snapshot of the underlying immutable array, so traversal is fast and cannot conflict with other threads
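As a minimal sketch of these properties (the class and variable names here are only illustrative), the following snippet adds elements to a CopyOnWriteArraySet and shows that an iterator created before a later write never sees that write:

```java
import java.util.Iterator;
import java.util.concurrent.CopyOnWriteArraySet;

public class CopyOnWriteArraySetDemo {
    public static void main(String[] args) {
        CopyOnWriteArraySet<String> tags = new CopyOnWriteArraySet<>();
        tags.add("read-mostly");
        tags.add("thread-safe");
        tags.add("thread-safe");           // duplicates are ignored, as with any Set

        // The iterator is backed by a snapshot of the underlying array...
        Iterator<String> it = tags.iterator();
        tags.add("added-later");           // ...so this later write is not visible to 'it'

        while (it.hasNext()) {
            System.out.println(it.next()); // prints only the two elements in the snapshot
        }
        System.out.println(tags.size());   // 3: the set itself does contain the new element
    }
}
```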

2. CopyOnWriteArrayList class

CopyOnWriteArrayList is a thread-safe variant of ArrayList in which all write operations are implemented by making a fresh copy of the underlying array, so writes to this collection are expensive. In return, CopyOnWriteArrayList achieves highly concurrent reads through the snapshot mechanism mentioned above: because the underlying implementation stores elements in an array, the collection always returns the element that was in the array at the moment the method was called, ignoring concurrent writes from other threads.

It is also useful when you cannot or do not want to synchronize traversals but need to rule out interference from concurrent threads. The snapshot-style iterator method uses a reference to the state of the array at the point the iterator was created. This array never changes during the lifetime of the iterator, so interference is impossible and the iterator is guaranteed not to throw ConcurrentModificationException. The iterator will not reflect additions, removals, or changes made to the list after the iterator was created. Element-changing operations on the iterator itself (remove, set, and add) are not supported; these methods throw UnsupportedOperationException.
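The snippet below is a small, hypothetical illustration of this snapshot behaviour: writing to the list while iterating does not throw ConcurrentModificationException, while calling remove() through the iterator does fail:

```java
import java.util.Iterator;
import java.util.concurrent.CopyOnWriteArrayList;

public class CopyOnWriteArrayListDemo {
    public static void main(String[] args) {
        CopyOnWriteArrayList<Integer> list = new CopyOnWriteArrayList<>();
        list.add(1);
        list.add(2);

        for (Integer value : list) {
            // Writing while iterating is safe: the loop sees the snapshot taken
            // when the iterator was created, so no ConcurrentModificationException.
            list.add(value + 10);
        }
        System.out.println(list);          // [1, 2, 11, 12]

        Iterator<Integer> it = list.iterator();
        it.next();
        try {
            it.remove();                   // mutating through the iterator is unsupported
        } catch (UnsupportedOperationException e) {
            System.out.println("iterator.remove() is not supported");
        }
    }
}
```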

3. ConcurrentLinkedQueue

An unbounded, thread-safe queue based on linked nodes. This queue orders its elements FIFO (first-in, first-out). The head of the queue is the element that has been in the queue the longest time; the tail is the element that has been in the queue the shortest time. New elements are inserted at the tail of the queue, and retrieval operations obtain elements from the head. ConcurrentLinkedQueue is an appropriate choice when many threads share access to a common collection. This queue does not permit null elements.

This implementation employs an efficient wait-free algorithm. Unlike most collections, the size method is not a constant-time operation; because of the asynchronous nature of this queue, determining the current number of elements requires traversing them.

All operations on this queue are non-blocking.
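A brief illustrative example of the non-blocking offer/poll operations (names and values are arbitrary); note that size() has to traverse the nodes, so isEmpty() is the cheaper emptiness check:

```java
import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentLinkedQueueDemo {
    public static void main(String[] args) {
        ConcurrentLinkedQueue<String> queue = new ConcurrentLinkedQueue<>();

        queue.offer("first");              // insert at the tail; never blocks
        queue.offer("second");

        System.out.println(queue.peek());  // "first" - the head is the oldest element
        System.out.println(queue.poll());  // "first" - retrieves and removes the head
        System.out.println(queue.poll());  // "second"
        System.out.println(queue.poll());  // null    - empty queue, nothing to remove

        // size() is O(n) because it traverses the nodes; prefer isEmpty() for emptiness checks.
        System.out.println(queue.isEmpty());
    }
}
```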

4. BlockingQueue Interface

BlockingQueue not only provides the full functionality of a queue, it also automatically manages waiting and waking threads in a multithreaded environment, allowing programmers to ignore those details and concentrate on higher-level functionality.

Features of BlockingQueue:

    • BlockingQueue is a blocking queue: if the queue is empty, a thread calling a method that takes an element from it is blocked until a new element is put into the queue and the blocked thread is woken up; conversely, if the queue is full, a thread calling a method that adds an element to it is blocked until space becomes available in the queue (a producer-consumer sketch follows the method list below)
    • BlockingQueue does not accept null elements
    • A BlockingQueue may be capacity-bounded
    • It is mainly used in producer-consumer designs
    • BlockingQueue implementations are thread-safe

Common methods of BlockingQueue:

add(E o): inserts the specified element into the queue if space is available; returns true on success, otherwise throws an IllegalStateException.

offer(E o): inserts the specified element into the queue; returns true on success, false otherwise.

put(E o): inserts the specified element into the queue, waiting if necessary for space to become available. (Blocking)

poll(long timeout, TimeUnit unit): retrieves and removes the head of this queue, waiting up to the specified wait time if necessary for an element to become available.

take(): retrieves and removes the head of this queue, waiting if necessary until an element becomes available. (Blocking)
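To make the blocking behaviour of put() and take() concrete, here is a minimal producer-consumer sketch; the queue type, capacity, and element values are arbitrary choices for illustration:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // A small bounded queue, so that put() actually blocks when the queue is full.
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 5; i++) {
                    queue.put(i);                      // blocks while the queue is full
                    System.out.println("produced " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 1; i <= 5; i++) {
                    Integer value = queue.take();      // blocks while the queue is empty
                    System.out.println("consumed " + value);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}
```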

4.1 ArrayBlockingQueue (bounded buffer, FIFO, supports a fair access policy)

(1) A bounded blocking queue backed by an array. This queue orders elements FIFO (first-in, first-out).

(2) The head of the queue is the element that has been in the queue the longest time, and the tail is the element that has been in the queue the shortest time. New elements are inserted at the tail of the queue, and retrieval operations obtain elements from the head.

(3) This is a classic "bounded buffer", in which a fixed-size array holds elements inserted by producers and extracted by consumers. Once created, the capacity cannot be increased. Attempting to put an element into a full queue blocks the put operation, and attempting to take an element from an empty queue blocks similarly.

(4) This class supports an optional fairness policy for ordering waiting producer and consumer threads. By default, this ordering is not guaranteed. However, a queue constructed with fairness set to true grants threads access in FIFO order. Fairness generally decreases throughput, but it reduces variability and avoids starvation.
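A small sketch of points (3) and (4): both the capacity and the fairness flag are fixed in the constructor (the values below are arbitrary):

```java
import java.util.concurrent.ArrayBlockingQueue;

public class ArrayBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Capacity 16 is fixed for the lifetime of the queue; the 'true' argument
        // enables the optional fairness policy, so waiting producer and consumer
        // threads are granted access in FIFO order (at some cost in throughput).
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(16, true);

        queue.put("task-1");                 // blocks only if the queue is full
        System.out.println(queue.take());    // blocks only if the queue is empty
    }
}
```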

4.2 LinkedBlockingQueue (unbounded by default)

(1) A blocking queue based on linked nodes. An optional constructor accepts a capacity to prevent the queue from expanding excessively; if no capacity is specified, the capacity is equal to Integer.MAX_VALUE. Each insertion dynamically creates a new node, unless the insertion would exceed the queue's capacity.

(2) This queue also orders elements FIFO. The head of the queue is the element that has been in the queue the longest time, the tail is the element that has been there the shortest time, and each new element is inserted at the tail of the queue.
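A minimal sketch of the two construction options (the explicit capacity shown is an arbitrary value):

```java
import java.util.concurrent.LinkedBlockingQueue;

public class LinkedBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // No capacity given: the queue is effectively unbounded (Integer.MAX_VALUE).
        LinkedBlockingQueue<String> unbounded = new LinkedBlockingQueue<>();

        // An explicit capacity prevents the queue from growing without limit.
        LinkedBlockingQueue<String> bounded = new LinkedBlockingQueue<>(1000);

        unbounded.put("a");
        bounded.put("b");
        System.out.println(unbounded.take() + " " + bounded.take());
    }
}
```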

4.3 DelayQueue (unbounded)

(1) An unbounded blocking queue of Delayed elements, from which an element can only be taken when its delay has expired.

(2) The head of the queue is the Delayed element whose delay expired furthest in the past. If no delay has expired, the queue has no head and poll returns null.

(3) Expiration occurs when an element's getDelay(TimeUnit.NANOSECONDS) method returns a value less than or equal to zero.

(4) The elements of this queue must implement the Delayed interface.
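Because every element must implement Delayed, a sketch needs a small element class of its own; the DelayedTask type below is illustrative and not part of the JDK:

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {

    // A hypothetical element type: it expires 'delayMillis' after it is created.
    static class DelayedTask implements Delayed {
        private final String name;
        private final long expiresAtMillis;

        DelayedTask(String name, long delayMillis) {
            this.name = name;
            this.expiresAtMillis = System.currentTimeMillis() + delayMillis;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            // Remaining delay; the element becomes available when this is <= 0.
            return unit.convert(expiresAtMillis - System.currentTimeMillis(),
                                TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS),
                                other.getDelay(TimeUnit.MILLISECONDS));
        }

        @Override
        public String toString() {
            return name;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<>();
        queue.put(new DelayedTask("later", 500));
        queue.put(new DelayedTask("sooner", 100));

        System.out.println(queue.poll());   // null: no element has expired yet
        System.out.println(queue.take());   // waits about 100 ms, then prints "sooner"
        System.out.println(queue.take());   // waits until "later" expires as well
    }
}
```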

4.4 PriorityBlockingQueue (unbounded)

(1) An unbounded blocking queue that uses the same ordering rules as the class PriorityQueue: a priority-based blocking queue (the priority is determined by the elements' natural ordering or by a Comparator passed to the constructor) that provides blocking retrieval operations.

(2) Although this queue is logically unbounded, an attempted add operation may fail (resulting in an OutOfMemoryError) when resources are exhausted.

(3) This class does not permit null elements.

(4) A priority queue relying on natural ordering also does not permit insertion of non-comparable objects (doing so results in a ClassCastException).
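A brief sketch of constructing the queue with a Comparator (the ordering and element values are arbitrary choices):

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Order strings by length instead of their natural (alphabetical) order.
        PriorityBlockingQueue<String> queue =
                new PriorityBlockingQueue<>(11, Comparator.comparingInt(String::length));

        queue.put("medium");
        queue.put("a");
        queue.put("quite a long element");

        System.out.println(queue.take());  // "a" - the highest-priority (shortest) element
        System.out.println(queue.take());  // "medium"
        System.out.println(queue.take());  // "quite a long element"
    }
}
```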

4.5 SynchronousQueue

(1) A blocking queue in which each put must wait for a take, and vice versa. A SynchronousQueue has no internal capacity, not even a capacity of one.

(2) You cannot peek at a SynchronousQueue, because an element is only present while you are trying to remove it; you cannot insert an element (using any method) unless another thread is trying to remove it; and you cannot iterate the queue, as there is nothing to iterate.

(3) The head of the queue is the element that the first queued inserting thread is trying to add; if there is no such queued thread, no element is available for removal and the head is null. For purposes of other Collection methods (for example, contains), a SynchronousQueue acts as an empty collection. This queue does not permit null elements.

(4) Synchronous queues are well suited to handoff designs, in which an object running in one thread must synchronize with an object running in another thread in order to hand it some information, an event, or a task.

(5) This class supports an optional fairness policy for ordering waiting producer and consumer threads. By default, this ordering is not guaranteed. However, a queue constructed with fairness set to true grants threads access in FIFO order. Fairness generally decreases throughput, but it reduces variability and avoids starvation.
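A minimal handoff sketch: the producing thread's put() can only complete once the consuming thread calls take(), and peek() always returns null because the queue never holds elements:

```java
import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        SynchronousQueue<String> handoff = new SynchronousQueue<>();

        Thread producer = new Thread(() -> {
            try {
                // put() blocks here until another thread is ready to take the element.
                handoff.put("message");
                System.out.println("handed off");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        System.out.println(handoff.peek());   // null: the queue never stores elements
        System.out.println(handoff.take());   // completes the handoff, prints "message"
        producer.join();
    }
}
```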

 
