[Java Basics] Java Multithreading Tools: BlockingQueue

Source: Internet
Author: User

Reprinted from: http://www.cnblogs.com/jackyuj/archive/2010/11/24/1886553.html

    • Objective:

The java.util.concurrent package's BlockingQueue solves the problem of how to "transfer" data between threads efficiently and safely. These efficient, thread-safe queue classes make it much easier to build high-quality multi-threaded programs quickly. This article walks through all the members of the BlockingQueue family, including their characteristics and common usage scenarios.

  • Meet BlockingQueue
    A blocking queue is, as the name implies, first of all a queue. As a data structure, a queue works roughly as follows: through a shared queue, data can be put in at one end and taken out at the other.
    Two queue orderings are common (of course, different implementations can extend these into many queue variants; DelayQueue is one of them):
    First in, first out (FIFO): the element inserted first is also the first to leave the queue, much like standing in line. To some extent, this ordering also embodies a kind of fairness.
    Last in, first out (LIFO): the element inserted last is the first to leave the queue; this ordering gives priority to the most recently added elements.

    In a multi-threaded environment, queues make data sharing easy, as in the classic "producer/consumer" model, where the two sides conveniently share data through a queue. Suppose we have several producer threads and several consumer threads. If the producers need to hand prepared data to the consumers, passing it through a queue solves the data-sharing problem elegantly. But what if the producers and consumers run at mismatched speeds during some period of time? Ideally, if the producers generate data faster than the consumers can process it, then once the backlog grows to a certain point the producers must pause (i.e., their threads block) and wait for the consumers to catch up, and vice versa. Before the concurrent package appeared, every programmer had to manage these details by hand, balancing efficiency against thread safety, which added considerable complexity to our programs. Fortunately, the powerful java.util.concurrent package arrived and brought with it a powerful BlockingQueue. (In the multi-threaded world, "blocking" means that under certain conditions a thread is suspended, and once the condition is met the suspended thread is automatically woken up.)
    The following two scenarios illustrate the two common ways BlockingQueue blocks:
    Scenario 1: when the queue is empty, all consumer threads are automatically blocked (suspended) until data is put into the queue.

    Scenario 2: when the queue is full, all producer threads are automatically blocked (suspended) until the queue has free slots, at which point they are automatically woken up.
    This is why we need BlockingQueue in a multi-threaded environment. As users of BlockingQueue, we no longer need to care about when to block threads and when to wake them up, because BlockingQueue handles all of that for us. Since BlockingQueue is so capable, let's take a look at its most common methods.
    The core methods of BlockingQueue
    Putting data in:
    offer(anObject): adds anObject to the queue if possible; returns true if the queue can accommodate it, otherwise returns false. (This method never blocks the calling thread.)
    offer(E o, long timeout, TimeUnit unit): like offer(anObject), but waits up to the specified time for space to become available; returns false if the element could not be added within that time.
    put(anObject): adds anObject to the queue; if the queue has no free space, the calling thread blocks until space becomes available.
    Taking data out:
    poll(): removes and returns the head of the queue, or returns null immediately if the queue is empty. (This method never blocks.)
    poll(long timeout, TimeUnit unit): removes and returns the head of the queue, waiting up to the specified time if necessary; as soon as data is available within that time it is returned, otherwise null is returned on timeout.
    take(): removes and returns the head of the queue; if the queue is empty, the calling thread blocks until new data is added.
    drainTo(): transfers all available elements from the queue into a collection in one operation (a variant lets you cap the number of elements transferred). This improves efficiency because the lock does not need to be acquired and released repeatedly.
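    The differences between these methods can be seen in a short sketch (not from the original article) that exercises them against a small ArrayBlockingQueue:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class CoreMethodsDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<String>(2);

        // offer: non-blocking insert, returns false when the queue is full
        System.out.println(queue.offer("a")); // true
        System.out.println(queue.offer("b")); // true
        System.out.println(queue.offer("c")); // false, queue is full

        // offer with timeout: waits up to 100 ms for space, then gives up
        System.out.println(queue.offer("c", 100, TimeUnit.MILLISECONDS)); // false

        // poll: non-blocking removal of the head element
        System.out.println(queue.poll()); // a

        // put: blocks until space is available (there is room now)
        queue.put("c");

        // take: blocks until an element is available
        System.out.println(queue.take()); // b

        // drainTo: moves all remaining elements into a collection
        // in a single locked pass
        List<String> drained = new ArrayList<String>();
        queue.drainTo(drained);
        System.out.println(drained); // [c]
    }
}
```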
    • Common BlockingQueue implementations
      Now that we understand the basic functions of BlockingQueue, let's meet the members of the BlockingQueue family.
    • BlockingQueue members in detail
      1. ArrayBlockingQueue

      An array-based blocking queue implementation. Internally, ArrayBlockingQueue maintains a fixed-length array to buffer the queue's data objects, plus two integer variables that mark the positions of the queue's head and tail within the array.
      ArrayBlockingQueue shares a single lock object between producers and consumers, which means the two sides cannot truly run in parallel. This is notably different from LinkedBlockingQueue; in principle, ArrayBlockingQueue could be implemented with split locks to allow producer and consumer operations to run fully in parallel. Doug Lea did not do this, perhaps because ArrayBlockingQueue's write and read operations are already lightweight enough that introducing a separate locking mechanism would add code complexity without any real performance benefit. Another notable difference from LinkedBlockingQueue is that ArrayBlockingQueue does not create or destroy any extra object instances when inserting or removing elements, whereas LinkedBlockingQueue allocates an additional node object for each element. In a system that must process large volumes of data efficiently and concurrently over a long period, this makes a measurable difference in GC load. Finally, when creating an ArrayBlockingQueue we can also choose whether its internal lock is fair; a non-fair lock is used by default.
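      The fairness choice mentioned above is made through the constructor. A minimal sketch (not from the original article):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FairQueueDemo {
    public static void main(String[] args) {
        // Default constructor: a non-fair internal lock (higher throughput)
        BlockingQueue<Integer> nonFair = new ArrayBlockingQueue<Integer>(100);

        // Passing true selects a fair lock: blocked producer and consumer
        // threads acquire the lock in FIFO order, at some throughput cost
        BlockingQueue<Integer> fair = new ArrayBlockingQueue<Integer>(100, true);

        nonFair.offer(1);
        fair.offer(1);
        System.out.println(nonFair.poll() + " " + fair.poll()); // 1 1
    }
}
```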

      2. LinkedBlockingQueue
      A linked-list-based blocking queue. Like ArrayBlockingQueue, it maintains an internal data buffer (here, a linked list of nodes). When a producer puts a piece of data into the queue, the queue takes it from the producer, caches it internally, and the producer returns immediately; only when the buffer reaches its maximum capacity (which can be specified through the constructor) does the producer block, until a consumer removes an element from the queue and a producer thread is woken up. The consumer side works on the same principle in reverse. LinkedBlockingQueue handles concurrent data efficiently because it uses separate locks for the producer and consumer sides, so producers and consumers can operate on the queue in parallel, improving the queue's overall throughput under high concurrency.
      As a developer, note that if you construct a LinkedBlockingQueue without specifying a capacity, it defaults to an effectively unbounded capacity (Integer.MAX_VALUE). In that case, if producers are faster than consumers, the system may run out of memory long before the queue ever fills up and blocks.
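      The capacity pitfall can be illustrated in a couple of lines (a sketch, not part of the original article):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BoundedQueueDemo {
    public static void main(String[] args) {
        // No capacity argument: the queue is effectively unbounded
        // (Integer.MAX_VALUE), so a fast producer can exhaust heap memory
        BlockingQueue<String> unbounded = new LinkedBlockingQueue<String>();

        // Prefer an explicit bound when producers may outpace consumers
        BlockingQueue<String> bounded = new LinkedBlockingQueue<String>(1000);

        System.out.println(unbounded.remainingCapacity()); // 2147483647
        System.out.println(bounded.remainingCapacity());   // 1000
    }
}
```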

      ArrayBlockingQueue and LinkedBlockingQueue are the two most common and most frequently used blocking queues; in general, these two classes are sufficient for handling producer/consumer problems between multiple threads.

      The following code demonstrates how to use a BlockingQueue:
    • import java.util.concurrent.BlockingQueue;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.LinkedBlockingQueue;

      /**
       * @author jackyuj
       */
      public class BlockingQueueTest {

          public static void main(String[] args) throws InterruptedException {
              // Declare a buffer queue with a capacity of 10
              BlockingQueue<String> queue = new LinkedBlockingQueue<String>(10);

              Producer producer1 = new Producer(queue);
              Producer producer2 = new Producer(queue);
              Producer producer3 = new Producer(queue);
              Consumer consumer = new Consumer(queue);

              // Create a thread pool with Executors
              ExecutorService service = Executors.newCachedThreadPool();

              // Start the threads
              service.execute(producer1);
              service.execute(producer2);
              service.execute(producer3);
              service.execute(consumer);

              // Run for 10 seconds
              Thread.sleep(10 * 1000);
              producer1.stop();
              producer2.stop();
              producer3.stop();

              Thread.sleep(2000);
              // Shut down the executor
              service.shutdown();
          }
      }

    • import java.util.Random;
      import java.util.concurrent.BlockingQueue;
      import java.util.concurrent.TimeUnit;

      /**
       * Consumer thread
       *
       * @author jackyuj
       */
      public class Consumer implements Runnable {

          public Consumer(BlockingQueue<String> queue) {
              this.queue = queue;
          }

          public void run() {
              System.out.println("Start the consumer thread!");
              Random r = new Random();
              boolean isRunning = true;
              try {
                  while (isRunning) {
                      System.out.println("Getting data from the queue...");
                      String data = queue.poll(2, TimeUnit.SECONDS);
                      if (null != data) {
                          System.out.println("Got data: " + data);
                          System.out.println("Consuming data: " + data);
                          Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                      } else {
                          // No data for more than 2s: assume all producers have
                          // exited, so exit the consumer thread too.
                          isRunning = false;
                      }
                  }
              } catch (InterruptedException e) {
                  e.printStackTrace();
                  Thread.currentThread().interrupt();
              } finally {
                  System.out.println("Exit the consumer thread!");
              }
          }

          private BlockingQueue<String> queue;
          private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;
      }

    • import java.util.Random;
      import java.util.concurrent.BlockingQueue;
      import java.util.concurrent.TimeUnit;
      import java.util.concurrent.atomic.AtomicInteger;

      /**
       * Producer thread
       *
       * @author jackyuj
       */
      public class Producer implements Runnable {

          public Producer(BlockingQueue<String> queue) {
              this.queue = queue;
          }

          public void run() {
              String data = null;
              Random r = new Random();
              System.out.println("Start the producer thread!");
              try {
                  while (isRunning) {
                      System.out.println("Producing data...");
                      Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                      data = "data:" + count.incrementAndGet();
                      System.out.println("Putting data: " + data + " into the queue...");
                      if (!queue.offer(data, 2, TimeUnit.SECONDS)) {
                          System.out.println("Failed to put data: " + data);
                      }
                  }
              } catch (InterruptedException e) {
                  e.printStackTrace();
                  Thread.currentThread().interrupt();
              } finally {
                  System.out.println("Exit the producer thread!");
              }
          }

          public void stop() {
              isRunning = false;
          }

          private volatile boolean isRunning = true;
          private BlockingQueue<String> queue;
          private static AtomicInteger count = new AtomicInteger();
          private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;
      }
      • 3. DelayQueue
        An element in a DelayQueue can only be taken from the queue once its specified delay has expired. DelayQueue is an unbounded queue, so the operations that insert data (producers) never block; only the operations that take data (consumers) can block.
        Usage scenarios:
        DelayQueue is used in relatively few scenarios, but those uses are quite ingenious. A common example is using a DelayQueue to manage a queue of connections that have not responded within a timeout.
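        Elements of a DelayQueue must implement the java.util.concurrent.Delayed interface. The following sketch (not from the original article; DelayedTask is a hypothetical class) shows how take() releases elements only after their delays expire:

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {

    // A hypothetical task that becomes available after a fixed delay
    static class DelayedTask implements Delayed {
        final String name;
        private final long expireAtMillis;

        DelayedTask(String name, long delayMillis) {
            this.name = name;
            this.expireAtMillis = System.currentTimeMillis() + delayMillis;
        }

        // Remaining delay; take() only returns elements whose delay <= 0
        public long getDelay(TimeUnit unit) {
            return unit.convert(expireAtMillis - System.currentTimeMillis(),
                                TimeUnit.MILLISECONDS);
        }

        // Order elements by remaining delay, soonest first
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS),
                                other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<DelayedTask>();
        queue.offer(new DelayedTask("late", 500));  // offer never blocks
        queue.offer(new DelayedTask("early", 100));

        // take() blocks until the earliest delay has expired
        System.out.println(queue.take().name); // early
        System.out.println(queue.take().name); // late
    }
}
```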

        4. PriorityBlockingQueue
        A priority-based blocking queue (the priority ordering is determined by the Comparator passed to the constructor). Note that PriorityBlockingQueue never blocks producers; it only blocks consumers when there is no data to consume. You must therefore take particular care that producers do not generate data faster than consumers can consume it, or over time all available heap memory will eventually be exhausted. Internally, PriorityBlockingQueue controls thread synchronization with a single lock.
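        A minimal sketch (not from the original article) of a PriorityBlockingQueue ordered by a custom Comparator:

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Order strings by length; the "smallest" element is taken first.
        // 11 is the initial capacity; the queue grows without bound.
        PriorityBlockingQueue<String> queue = new PriorityBlockingQueue<String>(
                11, new Comparator<String>() {
                    public int compare(String a, String b) {
                        return Integer.compare(a.length(), b.length());
                    }
                });

        // offer never blocks: producers are never held back
        queue.offer("medium");
        queue.offer("a-very-long-string");
        queue.offer("tiny");

        System.out.println(queue.take()); // tiny
        System.out.println(queue.take()); // medium
        System.out.println(queue.take()); // a-very-long-string
    }
}
```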

        5. SynchronousQueue
        A bufferless waiting queue, similar to a direct transaction with no middleman; a bit like producers and consumers in a primitive society, where the producer carries products to market to sell directly to the final consumer, and the consumer must go to market to find the producer of the goods they want. If either side fails to find the right counterpart, then sorry, both wait at the market. Compared to a buffered BlockingQueue, there is no middle-dealer (buffer) link. With a dealer, producers wholesale their products to the dealer without caring which consumers the dealer eventually sells them to, and since the dealer can hold stock, the dealer model offers higher overall throughput than direct trading (products can be sold in bulk). On the other hand, introducing a dealer adds an extra transaction step between producer and consumer, which can increase the latency of any single product.
        There are two ways to declare a SynchronousQueue, with different behaviors: fair mode and non-fair mode.
        Fair mode: the SynchronousQueue uses a fair lock together with a FIFO queue to hold the surplus producers and consumers, giving the system an overall fairness guarantee.
        Non-fair mode (the SynchronousQueue default): the queue uses a non-fair lock together with a LIFO stack to manage the surplus producers and consumers. In this mode, if producers and consumers differ in processing speed, starvation is likely: data from certain producers, or certain consumers, may never be processed.
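        The hand-over behavior can be sketched as follows (not from the original article): an offer with no waiting consumer fails immediately because there is no buffer, while put blocks until a consumer is ready to take the element.

```java
import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // true selects fair mode: waiting threads are served in FIFO order
        final SynchronousQueue<String> queue = new SynchronousQueue<String>(true);

        // No consumer is waiting, and there is no buffer to hold the
        // element, so a plain offer fails immediately
        System.out.println(queue.offer("ignored")); // false

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    // take() blocks until a producer hands an element over
                    System.out.println("received: " + queue.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        consumer.start();

        // put() blocks until the consumer above is ready to take the element
        queue.put("hello");
        consumer.join();
    }
}
```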
          • Summary
            BlockingQueue not only implements the full functionality of a basic queue, but also automatically manages waiting and waking between threads in a multi-threaded environment, letting programmers ignore those details and concentrate on higher-level functionality.

