[Java Basics] Java Multithreading Tools: BlockingQueue

Reprinted from: http://www.cnblogs.com/jackyuj/archive/2010/11/24/1886553.html

    • Objective:

In the java.util.concurrent package, BlockingQueue solves the problem of how to transfer data between multiple threads efficiently and safely. These efficient, thread-safe queue classes make it much easier to build high-quality multithreaded programs quickly. This article introduces the members of the BlockingQueue family, including their respective characteristics and common usage scenarios.

  • Meet BlockingQueue

    A blocking queue, as the name implies, is first of all a queue. As a data structure, a queue works roughly like this: data enters a shared queue from one end and is output from the other end. Two kinds of queue are most common (of course, many other kinds can be derived through different implementations; DelayQueue is one of them):

    FIFO (first in, first out): the element inserted first is also the first to leave the queue, similar to waiting in line. To some extent this kind of queue embodies a notion of fairness.

    LIFO (last in, first out): the element inserted last is the first to leave the queue; this kind of queue gives priority to the most recent events.
    In a multithreaded environment, data sharing can easily be achieved through a queue, as in the classic "producer/consumer" model, where a queue conveniently carries data between the two sides. Say we have some number of producer threads and some number of consumer threads. If the producer threads need to share prepared data with the consumer threads, passing the data through a queue solves the data-sharing problem easily. But what if, during some period of time, the producers and consumers process data at different speeds? Ideally, if the producers generate data faster than the consumers can handle it, then once the backlog grows to a certain point the producers should pause (the producer threads should block) and wait for the consumers to work through the accumulated data, and vice versa. Before the concurrent package was released, however, every programmer had to manage these details personally, especially with efficiency and thread safety in mind, which added considerable complexity to our programs. Fortunately, the powerful concurrent package arrived and brought us the equally powerful BlockingQueue. (In the multithreading domain, "blocking" means that under certain conditions a thread is suspended, and once the condition is met the suspended thread is automatically woken up.) BlockingQueue has two common blocking scenarios. First: when there is no data in the queue, all threads on the consumer side are automatically blocked (suspended) until data is put into the queue.
        Second: when the queue is full of data, all threads on the producer side are automatically blocked (suspended) until empty positions appear in the queue, at which point the threads are automatically woken up. This is why we need BlockingQueue in a multithreaded environment. As users of a BlockingQueue, we no longer need to care about when to block a thread and when to wake it up, because BlockingQueue handles all of that for us. Since BlockingQueue is so capable, let's take a look at its common methods.

        Core methods of BlockingQueue:

        Putting data:
        offer(anObject): adds anObject to the BlockingQueue if possible; that is, it returns true if the queue can accommodate the element and false otherwise. (This method never blocks the calling thread.)
        offer(E o, long timeout, TimeUnit unit): lets you set a waiting time; if the element cannot be added to the queue within the specified time, failure (false) is returned.
        put(anObject): adds anObject to the BlockingQueue; if the queue has no room, the calling thread blocks until there is space in the queue.

        Getting data:
        poll(time): takes the object at the head of the BlockingQueue; if it cannot be removed immediately, the call may wait for the time specified by the parameter and returns null if nothing is obtained.
        poll(long timeout, TimeUnit unit): removes the object at the head of the queue; if data becomes available within the specified time, it is returned immediately, otherwise the call times out and returns failure (null).
        take(): takes the object at the head of the BlockingQueue; if the queue is empty, the call blocks and waits until new data is added to the queue.
        drainTo(): transfers all available elements out of the BlockingQueue at once (you can also specify how many elements to take); this batch operation improves efficiency because the lock does not have to be acquired and released repeatedly.
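        To make these behavioral differences concrete, here is a minimal sketch (not part of the original article; the class name MethodDemo is just for illustration) that exercises the non-blocking, timed, and blocking variants against a small ArrayBlockingQueue:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.ArrayBlockingQueue;
        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.TimeUnit;

        // Illustrative sketch of the core BlockingQueue methods.
        public class MethodDemo {
            public static void main(String[] args) throws InterruptedException {
                BlockingQueue<String> queue = new ArrayBlockingQueue<String>(2);

                System.out.println(queue.offer("a"));                      // true, queue has room
                System.out.println(queue.offer("b"));                      // true, queue is now full
                System.out.println(queue.offer("c"));                      // false immediately, no blocking
                System.out.println(queue.offer("c", 1, TimeUnit.SECONDS)); // waits up to 1s, then false

                System.out.println(queue.take());                          // "a"; would block if the queue were empty
                System.out.println(queue.poll(1, TimeUnit.SECONDS));       // "b"; null if nothing arrives within 1s

                queue.put("d");                                            // blocks only while the queue is full

                List<String> drained = new ArrayList<String>();
                queue.drainTo(drained);                                    // bulk-removes all available elements
                System.out.println(drained);                               // [d]
            }
        }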
    • Common BlockingQueue implementations

      Now that we understand the basic functions of BlockingQueue, let's take a look at the common members of the BlockingQueue family.
    • BlockingQueue members in detail

      1. ArrayBlockingQueue

      An array-based blocking queue implementation. Internally, ArrayBlockingQueue maintains a fixed-length array to buffer the data objects in the queue; this is a very common blocking queue. Besides the fixed-length array, ArrayBlockingQueue also keeps two integer variables that mark the positions of the head and the tail of the queue within the array.

      ArrayBlockingQueue shares the same lock object between producers and consumers, which means the two can never truly run in parallel; this is notably different from LinkedBlockingQueue. In principle, ArrayBlockingQueue could use separate locks and allow producer and consumer operations to run fully in parallel. Doug Lea did not do this, probably because the data write and read operations of ArrayBlockingQueue are already lightweight enough that introducing an independent locking mechanism would add extra complexity to the code without bringing any real performance benefit.

      One notable difference between ArrayBlockingQueue and LinkedBlockingQueue is that the former does not create or destroy any additional object instances when inserting or removing elements, whereas the latter allocates an extra Node object for each element. In a system that needs to process large volumes of data efficiently and concurrently over a long period of time, this has some impact on GC behavior. When creating an ArrayBlockingQueue we can also control whether the object's internal lock is fair; a non-fair lock is used by default.
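      For example (a brief sketch, not from the original article), the fairness of the internal lock is chosen at construction time via the second constructor argument:

        import java.util.concurrent.ArrayBlockingQueue;
        import java.util.concurrent.BlockingQueue;

        // Illustrative helper showing the two ArrayBlockingQueue constructors discussed above.
        public class ArrayQueueConstruction {
            public static void main(String[] args) {
                // Capacity 10, default non-fair lock: better throughput, no ordering guarantee for waiting threads.
                BlockingQueue<String> unfair = new ArrayBlockingQueue<String>(10);

                // Capacity 10, fair lock: blocked producers and consumers acquire the lock in FIFO order.
                BlockingQueue<String> fair = new ArrayBlockingQueue<String>(10, true);

                System.out.println(unfair.remainingCapacity() + " / " + fair.remainingCapacity()); // 10 / 10
            }
        }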
      2. LinkedBlockingQueue

      A linked-list-based blocking queue. Like ArrayBlockingQueue, it maintains an internal data buffer (a linked list). When a producer puts a piece of data into the queue, the queue takes the data from the producer and caches it internally, and the producer returns immediately; only when the buffer reaches its maximum capacity (which can be specified through the LinkedBlockingQueue constructor) does the producer block, until a consumer consumes a piece of data from the queue and a producer thread is woken up. The consumer side works on the same principle in reverse. LinkedBlockingQueue processes concurrent data efficiently because it uses separate locks for the producer side and the consumer side, meaning producers and consumers can operate on the queue in parallel under high concurrency, which improves the concurrency performance of the whole queue.

      As a developer, note that if you construct a LinkedBlockingQueue without specifying its capacity, it defaults to a practically unlimited capacity (Integer.MAX_VALUE). In that case, if producers are faster than consumers, the system's memory may be exhausted before the queue ever becomes full enough to block.
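      As a reminder of that pitfall, the short sketch below (illustrative, not from the original article) contrasts the unbounded default with an explicitly bounded queue:

        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.LinkedBlockingQueue;

        // Illustrative sketch: prefer an explicit capacity when producers may outpace consumers.
        public class LinkedQueueCapacity {
            public static void main(String[] args) {
                // No capacity given: defaults to Integer.MAX_VALUE, effectively unbounded.
                BlockingQueue<String> unbounded = new LinkedBlockingQueue<String>();

                // Explicit bound: offer()/put() will fail or block once 1000 elements are queued,
                // applying back-pressure to fast producers instead of exhausting heap memory.
                BlockingQueue<String> bounded = new LinkedBlockingQueue<String>(1000);

                System.out.println(unbounded.remainingCapacity()); // 2147483647
                System.out.println(bounded.remainingCapacity());   // 1000
            }
        }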
      ArrayBlockingQueue and LinkedBlockingQueue are the two most common and most frequently used blocking queues; in general, these two classes are sufficient for handling producer/consumer problems between multiple threads.
      The following code demonstrates how to use a BlockingQueue:
      The test class:

        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.LinkedBlockingQueue;

        /**
         * @author jackyuj
         */
        public class BlockingQueueTest {

            public static void main(String[] args) throws InterruptedException {
                // Declare a buffer queue with a capacity of 10
                BlockingQueue<String> queue = new LinkedBlockingQueue<String>(10);

                Producer producer1 = new Producer(queue);
                Producer producer2 = new Producer(queue);
                Producer producer3 = new Producer(queue);
                Consumer consumer = new Consumer(queue);

                // Start the threads
                ExecutorService service = Executors.newCachedThreadPool();
                service.execute(producer1);
                service.execute(producer2);
                service.execute(producer3);
                service.execute(consumer);

                // Run for 10 seconds
                Thread.sleep(10 * 1000);
                producer1.stop();
                producer2.stop();
                producer3.stop();

                Thread.sleep(2000);
                // Exit the Executor
                service.shutdown();
            }
        }

      The consumer thread:

        import java.util.Random;
        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.TimeUnit;

        /**
         * Consumer thread
         *
         * @author jackyuj
         */
        public class Consumer implements Runnable {

            public Consumer(BlockingQueue<String> queue) {
                this.queue = queue;
            }

            public void run() {
                System.out.println("Start the consumer thread!");
                Random r = new Random();
                boolean isRunning = true;
                try {
                    while (isRunning) {
                        System.out.println("Fetching data from the queue...");
                        String data = queue.poll(2, TimeUnit.SECONDS);
                        if (null != data) {
                            System.out.println("Got the data: " + data);
                            System.out.println("Consuming data: " + data);
                            Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                        } else {
                            // No data for more than 2s: assume all producer threads have exited, so exit the consumer thread.
                            isRunning = false;
                        }
                    }
                } catch (InterruptedException e) {
                    e.printStackTrace();
                    Thread.currentThread().interrupt();
                } finally {
                    System.out.println("Exit the consumer thread!");
                }
            }

            private BlockingQueue<String> queue;
            private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;
        }

      The producer thread:

        import java.util.Random;
        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.TimeUnit;
        import java.util.concurrent.atomic.AtomicInteger;

        /**
         * Producer thread
         *
         * @author jackyuj
         */
        public class Producer implements Runnable {

            public Producer(BlockingQueue<String> queue) {
                this.queue = queue;
            }

            public void run() {
                String data = null;
                Random r = new Random();
                System.out.println("Start the producer thread!");
                try {
                    while (isRunning) {
                        System.out.println("Producing data...");
                        Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                        data = "data:" + count.incrementAndGet();
                        System.out.println("Adding data: " + data + " into the queue...");
                        if (!queue.offer(data, 2, TimeUnit.SECONDS)) {
                            System.out.println("Failed to put data: " + data);
                        }
                    }
                } catch (InterruptedException e) {
                    e.printStackTrace();
                    Thread.currentThread().interrupt();
                } finally {
                    System.out.println("Exit the producer thread!");
                }
            }

            public void stop() {
                isRunning = false;
            }

            private volatile boolean isRunning = true;
            private BlockingQueue<String> queue;
            private static AtomicInteger count = new AtomicInteger();
            private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;
        }
      • 3. DelayQueue

        An element in a DelayQueue can only be taken from the queue when its specified delay time has expired. DelayQueue is a queue with no size limit, so the operations that insert data into the queue (producers) are never blocked; only the operations that take data out (consumers) can block.

        Usage scenario: DelayQueue is used in relatively few scenarios, but those uses are quite ingenious. A common example is using a DelayQueue to manage a queue of connections that have timed out without a response.
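        A minimal sketch of that idea (the DelayedTask class below is hypothetical, not from the original article): elements implement java.util.concurrent.Delayed and only become visible to take() once their delay has elapsed.

        import java.util.concurrent.DelayQueue;
        import java.util.concurrent.Delayed;
        import java.util.concurrent.TimeUnit;

        // Hypothetical element type: becomes available only after its deadline has passed.
        class DelayedTask implements Delayed {
            private final String name;
            private final long deadline; // absolute time in milliseconds

            DelayedTask(String name, long delayMillis) {
                this.name = name;
                this.deadline = System.currentTimeMillis() + delayMillis;
            }

            public long getDelay(TimeUnit unit) {
                return unit.convert(deadline - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
            }

            public int compareTo(Delayed other) {
                return Long.compare(getDelay(TimeUnit.MILLISECONDS), other.getDelay(TimeUnit.MILLISECONDS));
            }

            public String toString() {
                return name;
            }
        }

        public class DelayQueueDemo {
            public static void main(String[] args) throws InterruptedException {
                DelayQueue<DelayedTask> queue = new DelayQueue<DelayedTask>();
                queue.put(new DelayedTask("expires in 2s", 2000)); // put() never blocks: the queue is unbounded
                queue.put(new DelayedTask("expires in 1s", 1000));

                // take() blocks until the earliest deadline is reached, so elements come out in expiry order.
                System.out.println(queue.take()); // after ~1s: "expires in 1s"
                System.out.println(queue.take()); // after ~2s: "expires in 2s"
            }
        }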
        4. PriorityBlockingQueue

        A priority-based blocking queue (the priority is determined by the Comparator object passed to the constructor). Note that PriorityBlockingQueue never blocks data producers; it only blocks data consumers when there is no data available to consume. You must therefore take particular care that producers do not generate data faster than consumers can consume it, or over time all available heap memory will eventually be exhausted. Internally, PriorityBlockingQueue synchronizes threads with a fair lock.
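        For illustration (a sketch, not from the original article), a Comparator supplied to the constructor drives the ordering, and take() always returns the element ranked first by that Comparator:

        import java.util.Comparator;
        import java.util.concurrent.PriorityBlockingQueue;

        // Illustrative sketch of a Comparator-driven PriorityBlockingQueue.
        public class PriorityQueueDemo {
            public static void main(String[] args) throws InterruptedException {
                // Shorter strings are treated as higher priority (Comparator passed to the constructor).
                PriorityBlockingQueue<String> queue = new PriorityBlockingQueue<String>(
                        11, Comparator.comparingInt(String::length));

                // offer() never blocks: the queue grows as needed, which is why producers must be throttled elsewhere.
                queue.offer("medium-length");
                queue.offer("a");
                queue.offer("quite a bit longer element");

                System.out.println(queue.take()); // "a"
                System.out.println(queue.take()); // "medium-length"
                System.out.println(queue.take()); // "quite a bit longer element"
            }
        }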
        5. SynchronousQueue

        A buffer-free waiting queue, similar to a direct transaction with no middleman. It is a bit like producers and consumers in a primitive society: the producer takes the product to market and sells it directly to the final consumer, and the consumer must go to the market to find the producer of the goods in person; if either side fails to find a suitable counterpart, then sorry, everyone waits at the market. A buffered BlockingQueue, by contrast, has an intermediate dealer (the buffer): producers wholesale their products directly to the dealer without caring which consumers the dealer eventually sells them to, and because the dealer can stock part of the goods, the dealer model generally gives higher overall throughput than direct trading (goods can be sold in batches). On the other hand, introducing the dealer adds an extra link between producer and consumer, which can increase the latency for a single product.

        There are two ways to declare a SynchronousQueue, and they behave differently: fair mode and non-fair mode. In fair mode, SynchronousQueue uses a fair lock together with a FIFO queue to manage surplus producers and consumers, which keeps the whole system fair. In non-fair mode (the SynchronousQueue default), it uses a non-fair lock together with a LIFO stack to manage surplus producers and consumers. In the latter mode, if the processing speeds of producers and consumers differ, starvation can easily occur: the data of some producers or some consumers may never be processed.
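        The following sketch (illustrative only, not from the original article) shows the direct hand-off behavior and the fairness flag in the constructor:

        import java.util.concurrent.SynchronousQueue;
        import java.util.concurrent.TimeUnit;

        // Illustrative sketch: a SynchronousQueue has no capacity, so every put() must meet a take().
        public class SynchronousQueueDemo {
            public static void main(String[] args) throws InterruptedException {
                // true = fair mode (FIFO ordering of waiting threads); the default is non-fair mode.
                final SynchronousQueue<String> queue = new SynchronousQueue<String>(true);

                Thread consumer = new Thread(new Runnable() {
                    public void run() {
                        try {
                            // take() blocks until some producer hands an element over directly.
                            System.out.println("received: " + queue.take());
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    }
                });
                consumer.start();

                // put() blocks until the consumer above is ready to take the element.
                queue.put("hello");
                consumer.join();

                // With no consumer waiting, a timed offer() fails: nothing can be buffered.
                System.out.println(queue.offer("ignored", 100, TimeUnit.MILLISECONDS)); // false
            }
        }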
          • Summary

            BlockingQueue not only implements the full functionality of a basic queue, but also automatically manages the blocking and waking of threads in a multithreaded environment, allowing programmers to ignore these details and concentrate on higher-level functionality.
