BlockingQueue: An Introduction to Sharing Data Between Java Threads


In Java's concurrent package, BlockingQueue solves the problem of how to transfer data between multiple threads efficiently and safely. These efficient, thread-safe queue classes make it much easier for us to quickly build high-quality multithreaded programs. This article describes the members of the BlockingQueue family, including their respective features and common usage scenarios.

I. Understanding BlockingQueue

A blocking queue, as the name implies, is first of all a queue. The role a queue plays as a data structure is roughly as shown below:


As the figure shows, with a shared queue, data can be put in at one end of the queue and taken out at the other end.
There are two commonly used kinds of queue (of course, many other kinds can be derived from different implementations; DelayQueue is one of them):
First in, first out (FIFO): the element inserted into the queue first is also the first to leave it, similar to queuing up in everyday life. To some extent, this kind of queue embodies a notion of fairness.
Last in, first out (LIFO): the element inserted into the queue most recently is the first to leave it; this kind of queue gives priority to the most recent events.

In a multithreaded environment, data sharing can easily be achieved through a queue, as in the classic "producer/consumer" model, where a queue conveniently passes data between the two sides. Suppose we have some number of producer threads and some number of consumer threads. If the producer threads need to hand prepared data to the consumer threads, using a queue to pass the data easily solves the data-sharing problem between them.

But what if, during a certain period of time, the producers and consumers process data at different speeds? Ideally, if producers generate data faster than consumers can process it, then once the accumulated data reaches a certain level the producers must pause (the producer threads are blocked) and wait for the consumers to finish processing the backlog, and vice versa. Before the concurrent package was released, however, every programmer had to handle these details personally, while also taking care of efficiency and thread safety, which added considerable complexity to our programs. Fortunately, the powerful concurrent package arrived and brought with it an equally powerful BlockingQueue. (In the multithreading world, "blocking" means that under certain conditions a thread is suspended, and once the condition is satisfied the suspended thread is automatically woken up.) The following two figures illustrate the two common blocking scenarios of BlockingQueue:

As shown: when there is no data in the queue, all threads on the consumer side are automatically blocked (suspended) until data is put into the queue.

As shown: when the queue is full of data, all threads on the producer side are automatically blocked (suspended) until an empty position appears in the queue, at which point the threads are automatically woken up.

This is why we need BlockingQueue in a multithreaded environment. As users of BlockingQueue, we no longer need to care about when a thread must be blocked and when it must be woken up, because BlockingQueue handles all of that for us. Since BlockingQueue is so capable, let's take a look at its common methods:

II. Common methods defined by BlockingQueue

            Throws exception    Special value         Blocks           Times out
Insert      add(e)              offer(e)              put(e)           offer(e, time, unit)
Remove      remove()            poll()                take()           poll(time, unit)
Examine     element()           peek() / isEmpty()    not applicable   not applicable

Note: null is returned as a sentinel value to indicate that a poll (or peek) operation failed.

1) Inserting objects:

BlockingQueue does not accept null objects. Attempting to add, put, or offer a null throws a NullPointerException.

add(e): adds the object to the BlockingQueue and returns true if the queue can hold it; otherwise it throws an exception (IllegalStateException).

offer(e): adds the object to the BlockingQueue, returning true if the queue can accommodate it and false otherwise. This method does not block the calling thread. The variant offer(e, timeout, unit) lets you set a waiting time; if the element cannot be added to the queue within the specified time, it returns false.

put(e): adds the object to the BlockingQueue. If the queue has no free space, the calling thread is blocked until the queue has room for the element.
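As a minimal hedged sketch (the InsertDemo class and variable names are mine, not from the article), the four insertion methods behave differently on a bounded queue:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class InsertDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> q = new ArrayBlockingQueue<String>(1); // capacity 1
        q.add("a");                                            // succeeds, queue is now full
        System.out.println(q.offer("b"));                      // false: returns immediately, no blocking
        System.out.println(q.offer("c", 1, TimeUnit.SECONDS)); // false after waiting up to 1 second
        // q.put("d");                                         // would block until space becomes available
        // q.add("e");                                         // would throw IllegalStateException: Queue full
    }
}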

2) Retrieving objects:

poll(): retrieves and removes the object at the head of the BlockingQueue, returning null immediately if nothing is available. poll(long timeout, TimeUnit unit): retrieves and removes the head of the queue, returning the data as soon as any becomes available within the specified time; if the timeout elapses without any data becoming available, it returns null.

take(): retrieves and removes the object at the head of the BlockingQueue. If the queue is empty, the calling thread blocks and enters the waiting state until new data is added to the queue and can be taken.

drainTo(): transfers all available data objects out of the BlockingQueue in one go (a variant also lets you specify the maximum number of elements to transfer). This method improves the efficiency of retrieving data, since a whole batch is moved without having to acquire and release the lock repeatedly.
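A small hedged sketch of the retrieval methods (the RetrieveDemo class is mine, not from the article):

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class RetrieveDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> q = new LinkedBlockingQueue<String>();
        q.offer("a");
        q.offer("b");
        q.offer("c");

        System.out.println(q.poll());                    // "a": returns immediately
        System.out.println(q.poll(1, TimeUnit.SECONDS)); // "b": would wait up to 1s if the queue were empty
        // String s = q.take();                          // would block indefinitely on an empty queue

        List<String> batch = new ArrayList<String>();
        q.drainTo(batch, 10);                            // move up to 10 remaining elements in one operation
        System.out.println(batch);                       // [c]
    }
}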

3) Examining objects:

element(): returns the first object in the BlockingQueue without removing it from the queue; throws an exception (NoSuchElementException) if the queue is empty.

peek(): returns the first object in the BlockingQueue without removing it from the queue; returns null if the queue is empty.

isEmpty(): checks whether the BlockingQueue contains any objects; returns true if the queue is empty, false otherwise.
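A tiny hedged sketch of the examining methods (the ExamineDemo class is mine, not from the article):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ExamineDemo {
    public static void main(String[] args) {
        BlockingQueue<String> q = new LinkedBlockingQueue<String>();
        System.out.println(q.isEmpty());  // true
        System.out.println(q.peek());     // null: empty queue, nothing is removed
        // q.element();                   // would throw NoSuchElementException on an empty queue
        q.offer("a");
        System.out.println(q.element());  // "a": the element stays in the queue
    }
}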

III. Common BlockingQueue implementations

Having understood the basic functions of BlockingQueue, let's take a look at the members of the BlockingQueue family.

1. ArrayBlockingQueue

An array-based blocking queue implementation. Internally, ArrayBlockingQueue maintains a fixed-length array to buffer the data objects in the queue; it is a very common blocking queue. Besides the fixed-length array, ArrayBlockingQueue also keeps two integer variables that identify the positions of the head and the tail of the queue within the array.

ArrayBlockingQueue shares the same lock object between producers and consumers, which means the two can never truly run in parallel. This is notably different from LinkedBlockingQueue; in principle, ArrayBlockingQueue could use split locks so that producer and consumer operations run fully in parallel. Doug Lea did not do this, perhaps because the data writes and reads in ArrayBlockingQueue are already lightweight enough that introducing a separate locking mechanism would add extra complexity to the code without bringing any real performance benefit.

One notable difference between ArrayBlockingQueue and LinkedBlockingQueue is that the former does not create or destroy any additional object instances when inserting or removing elements, while the latter allocates an extra Node object for each element. In a system that needs to process large volumes of data efficiently and concurrently over a long period of time, this makes a certain difference in GC impact. When creating an ArrayBlockingQueue, we can also control whether its internal lock is a fair lock; an unfair lock is used by default.
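As a small hedged sketch (the FairnessDemo class is mine, not from the article), the fairness of the internal lock is chosen through the constructor:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FairnessDemo {
    public static void main(String[] args) {
        // Capacity 10, default (unfair) lock
        BlockingQueue<String> unfair = new ArrayBlockingQueue<String>(10);
        // Capacity 10, fair lock: blocked producer/consumer threads are served in FIFO order
        BlockingQueue<String> fair = new ArrayBlockingQueue<String>(10, true);
        System.out.println(unfair.remainingCapacity() + " " + fair.remainingCapacity()); // 10 10
    }
}

The producer/consumer example below then exercises blocking inserts and timed retrieval end to end (note that it actually constructs a LinkedBlockingQueue of capacity 10, though an ArrayBlockingQueue would work the same way):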

// BlockingQueueTest.java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class BlockingQueueTest {

    public static void main(String[] args) throws InterruptedException {
        // Declare a buffer queue with a capacity of 10
        BlockingQueue<String> queue = new LinkedBlockingQueue<String>(10);
        queue.add("SSS");

        // Test adding a null element: add, offer and put all throw NullPointerException
        try {
            queue.add(null);
        } catch (Exception e) {
            e.printStackTrace();
        }
        try {
            queue.offer(null);
        } catch (Exception e) {
            e.printStackTrace();
        }
        try {
            queue.put(null);
        } catch (Exception e) {
            e.printStackTrace();
        }

        Producer producer1 = new Producer(queue);
        Producer producer2 = new Producer(queue);
        Producer producer3 = new Producer(queue);
        Consumer consumer = new Consumer(queue);

        // Use an Executors thread pool
        ExecutorService service = Executors.newCachedThreadPool();
        // Start the threads
        service.execute(producer1);
        service.execute(producer2);
        service.execute(producer3);
        service.execute(consumer);

        // Run for 10 seconds
        Thread.sleep(10 * 1000);
        producer1.stop();
        producer2.stop();
        producer3.stop();

        Thread.sleep(2000);
        // Shut down the executor
        service.shutdown();
    }
}

// Consumer.java
import java.util.Random;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

/**
 * Consumer thread
 */
public class Consumer implements Runnable {

    public Consumer(BlockingQueue<String> queue) {
        this.queue = queue;
    }

    public void run() {
        System.out.println("Start the consumer thread!");
        Random r = new Random();
        boolean isRunning = true;
        try {
            while (isRunning) {
                System.out.println("Fetching data from the queue...");
                String data = queue.poll(2, TimeUnit.SECONDS);
                if (null != data) {
                    System.out.println("Got data: " + data);
                    System.out.println("Consuming data: " + data);
                    Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                } else {
                    // No data for more than 2s: assume all producers have exited, so stop the consumer thread
                    isRunning = false;
                }
            }
        } catch (InterruptedException e) {
            e.printStackTrace();
            Thread.currentThread().interrupt();
        } finally {
            System.out.println("Exit the consumer thread!");
        }
    }

    private BlockingQueue<String> queue;
    private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;
}

// Producer.java
import java.util.Random;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Producer thread
 *
 * @author jackyuj
 */
public class Producer implements Runnable {

    public Producer(BlockingQueue<String> queue) {
        this.queue = queue;
    }

    public void run() {
        String data = null;
        Random r = new Random();
        System.out.println("Start the producer thread!");
        try {
            while (isRunning) {
                System.out.println("Producing data...");
                Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                data = "data:" + count.incrementAndGet();
                System.out.println("Putting data: " + data + " into the queue...");
                if (!queue.offer(data, 2, TimeUnit.SECONDS)) {
                    System.out.println("Failed to put data: " + data);
                }
            }
        } catch (InterruptedException e) {
            e.printStackTrace();
            Thread.currentThread().interrupt();
        } finally {
            System.out.println("Exit the producer thread!");
        }
    }

    public void stop() {
        isRunning = false;
    }

    private volatile boolean isRunning = true;
    private BlockingQueue<String> queue;
    private static AtomicInteger count = new AtomicInteger();
    private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;
}

2. LinkedBlockingQueue

A linked-list-based blocking queue. Like ArrayBlockingQueue, it maintains an internal data buffer (here backed by a linked list). When a producer puts a piece of data into the queue, the queue takes the data from the producer, caches it internally, and the producer returns immediately; only when the buffer reaches its maximum capacity (which can be specified through the LinkedBlockingQueue constructor) is the producer blocked, until a consumer consumes a piece of data from the queue and a producer thread is woken up. The consumer side works on the same principle in reverse. LinkedBlockingQueue handles concurrent data efficiently because it uses separate locks for the producer side and the consumer side to control data synchronization, which means producers and consumers can operate on the queue in parallel under high concurrency, improving the concurrency performance of the whole queue.

As a developer, note that if you construct a LinkedBlockingQueue without specifying its capacity, it defaults to a practically unbounded capacity (Integer.MAX_VALUE). In that case, if producers are faster than consumers, the system's memory may be exhausted long before the queue becomes full enough to block.
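A minimal hedged sketch of the difference (the CapacityDemo class and variable names are mine):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class CapacityDemo {
    public static void main(String[] args) {
        // Effectively unbounded: capacity defaults to Integer.MAX_VALUE, so put() practically never
        // blocks and a fast producer can exhaust heap memory long before the queue "fills up".
        BlockingQueue<String> unbounded = new LinkedBlockingQueue<String>();
        // Bounded: once 1000 elements are buffered, put() blocks until a consumer makes room.
        BlockingQueue<String> bounded = new LinkedBlockingQueue<String>(1000);
        System.out.println(unbounded.remainingCapacity()); // 2147483647
        System.out.println(bounded.remainingCapacity());   // 1000
    }
}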

ArrayBlockingQueue and LinkedBlockingQueue are the two most common and most frequently used blocking queues; in general, these two classes are sufficient for handling producer/consumer problems between multiple threads.

3. DelayQueue

An element in a DelayQueue can be taken from the queue only when its specified delay has expired. DelayQueue is a queue with no size limit, so operations that insert data into the queue (producers) are never blocked; only operations that take data out (consumers) can block.

Usage scenarios: DelayQueue is used in relatively few scenarios, but those uses are quite ingenious. A common example is using a DelayQueue to manage a queue of connections that have timed out without responding.
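As a hedged sketch of how this works (the DelayedTask class and its fields are hypothetical, not from the article), elements placed in a DelayQueue must implement the java.util.concurrent.Delayed interface:

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

// A hypothetical delayed task: it becomes available 'delayMillis' after creation.
class DelayedTask implements Delayed {
    private final String name;
    private final long expireAt;   // absolute expiry time in milliseconds

    DelayedTask(String name, long delayMillis) {
        this.name = name;
        this.expireAt = System.currentTimeMillis() + delayMillis;
    }

    public long getDelay(TimeUnit unit) {
        return unit.convert(expireAt - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
    }

    public int compareTo(Delayed other) {
        return Long.compare(getDelay(TimeUnit.MILLISECONDS), other.getDelay(TimeUnit.MILLISECONDS));
    }

    public String toString() { return name; }
}

public class DelayQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<DelayedTask>();
        queue.put(new DelayedTask("expires in 2s", 2000)); // put() never blocks: the queue is unbounded
        queue.put(new DelayedTask("expires in 1s", 1000));
        System.out.println(queue.take());                  // blocks about 1s, then prints "expires in 1s"
        System.out.println(queue.take());                  // blocks until the 2s delay has also expired
    }
}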

4. PriorityBlockingQueue

A priority-based blocking queue (the priority order is determined by the Comparator passed to the constructor). Note, however, that PriorityBlockingQueue never blocks data producers; it only blocks consumers when there is no data to consume. It is therefore particularly important that producers do not generate data faster than consumers can consume it, otherwise, over time, all available heap memory will eventually be exhausted. According to the original article, PriorityBlockingQueue uses a fair lock for its internal thread synchronization.

Objects stored in a PriorityBlockingQueue must implement the Comparable interface (unless a Comparator is supplied); the queue determines the priority order of the objects through that interface's compareTo method.
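A hedged sketch of priority ordering via Comparable (the Task class and its fields are hypothetical, not from the article):

import java.util.concurrent.PriorityBlockingQueue;

// Hypothetical task whose natural ordering is "lower priority value comes out first".
class Task implements Comparable<Task> {
    final int priority;
    final String name;

    Task(int priority, String name) {
        this.priority = priority;
        this.name = name;
    }

    public int compareTo(Task other) {
        return Integer.compare(this.priority, other.priority);
    }

    public String toString() { return name + "(p=" + priority + ")"; }
}

public class PriorityDemo {
    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<Task> queue = new PriorityBlockingQueue<Task>();
        queue.put(new Task(5, "low"));     // put() never blocks: the queue grows as needed
        queue.put(new Task(1, "urgent"));
        queue.put(new Task(3, "normal"));
        System.out.println(queue.take());  // urgent(p=1): the smallest element comes out first
        System.out.println(queue.take());  // normal(p=3)
    }
}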

5. SynchronousQueue

A waiting queue with no buffer, resembling a direct transaction with no middleman, a bit like producers and consumers in a primitive society: the producer takes the product to the market and sells it directly to the final consumer, and the consumer must go to the market to find the producer of the goods in person; if either side fails to find a suitable counterpart, then, sorry, everyone waits at the market. Compared to a buffered BlockingQueue, there is no intermediate dealer (buffer). With a dealer, producers wholesale products directly to the dealer without caring which consumers the dealer eventually sells them to, and because the dealer can stock some goods, the dealer model generally achieves higher overall throughput than direct trading (goods can be sold in batches). On the other hand, introducing a dealer adds an extra hop on the product's way from producer to consumer, which may increase the time a single product takes to be delivered.

There are two different ways to declare a SynchronousQueue, and they behave differently: fair mode and unfair mode.

Fair mode: the SynchronousQueue uses a fair lock, backed by a FIFO queue, to manage surplus producers and consumers, giving the system an overall fairness policy.

Unfair mode (the SynchronousQueue default): the SynchronousQueue uses an unfair lock, backed by a LIFO stack, to manage surplus producers and consumers. In this mode, if there is a gap between producer and consumer processing speeds, starvation can easily occur: data from certain producers, or certain consumers, may never be processed.
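A minimal hedged sketch of the hand-off behavior (the SynchronousDemo class is mine, not from the article); the boolean constructor argument selects fair mode:

import java.util.concurrent.SynchronousQueue;

public class SynchronousDemo {
    public static void main(String[] args) throws InterruptedException {
        // true = fair mode (FIFO among waiting threads); the no-arg constructor gives unfair mode
        final SynchronousQueue<String> queue = new SynchronousQueue<String>(true);

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    // take() waits until a producer hands an element over directly
                    System.out.println("got: " + queue.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        consumer.start();

        // put() blocks until a consumer is ready to take the element: there is no buffer at all
        queue.put("hello");
        consumer.join();
    }
}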

6. Summary

BlockingQueue not only implements the functions of a complete queue, but also automatically manages waiting and waking between multiple threads in a multithreaded environment, allowing programmers to ignore those details and focus on higher-level functionality.

ArrayBlockingQueue: a BlockingQueue of a specified size; its constructor must take an int parameter indicating its capacity. The objects it contains are ordered FIFO (first in, first out).

LinkedBlockingQueue: a BlockingQueue of variable size. If its constructor is given a size parameter, the resulting BlockingQueue has a capacity limit; without the size parameter, the queue's capacity is determined by Integer.MAX_VALUE. The objects it contains are ordered FIFO (first in, first out).

PriorityBlockingQueue: similar to LinkedBlockingQueue, but the objects it contains are not ordered FIFO; instead they are ordered by the objects' natural ordering or by the Comparator passed to the constructor.

SynchronousQueue: a special BlockingQueue in which put and take operations must complete in alternating pairs.

Comparing LinkedBlockingQueue and ArrayBlockingQueue: the data structures behind them differ, which generally gives LinkedBlockingQueue a higher data throughput than ArrayBlockingQueue; however, when the number of threads is large, its performance is less predictable than ArrayBlockingQueue's.

IV. A few points to note about BlockingQueue

1. A BlockingQueue can have a bounded capacity.

At any given time it has a remainingCapacity, beyond which no additional elements can be put without blocking. A BlockingQueue without any internal capacity constraint always reports a remaining capacity of Integer.MAX_VALUE.

2. BlockingQueue implementations are primarily intended for producer-consumer queues, but they also support the Collection interface.

For example, it is possible to use remove(x) to remove an arbitrary element from the queue. However, such operations are usually not performed efficiently and should be used only occasionally, for example when a queued message is cancelled.

3. BlockingQueue implementations are thread-safe.

All queuing methods achieve their effect atomically, using internal locks or other forms of concurrency control. However, the bulk Collection operations (addAll, containsAll, retainAll, and removeAll) are not necessarily performed atomically unless the implementation specifically states otherwise. So, for example, addAll(c) may fail (throwing an exception) after adding only some of the elements in c.

4. BlockingQueue essentially does not support any kind of "close" or "shutdown" operation to indicate that no more items will be added.

Whether such functionality is needed, and how it is provided, tends to depend on the implementation. For example, a common strategy is for producers to insert special end-of-stream or poison objects, which consumers interpret accordingly when they take them.
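A minimal hedged sketch of the poison-object strategy (the POISON sentinel and the PoisonPillDemo class are hypothetical, not from the article):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PoisonPillDemo {
    // A sentinel object that the consumer interprets as "no more data will arrive".
    private static final String POISON = "__END_OF_STREAM__";

    public static void main(String[] args) throws InterruptedException {
        final BlockingQueue<String> queue = new LinkedBlockingQueue<String>();

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    while (true) {
                        String item = queue.take();
                        if (item == POISON) {   // reference comparison is enough for the sentinel object
                            break;              // the producer has signalled shutdown
                        }
                        System.out.println("consumed: " + item);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        consumer.start();

        queue.put("data-1");
        queue.put("data-2");
        queue.put(POISON);   // signal the consumer that no more items will be added
        consumer.join();
    }
}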

References:
http://wsmajunfeng.iteye.com/blog/1629354
http://zzhonghe.iteye.com/blog/826757
http://blog.csdn.net/xin_jmail/article/details/26157971
