A deep understanding of blocking queue containers in Java Thread Programming

This article introduces the blocking queue containers in Java thread programming and the blocking queue implementations provided by the JDK.

1. What is a blocking queue?

BlockingQueue is a queue that supports two additional operations: when the queue is empty, a thread trying to obtain an element waits until the queue becomes non-empty; when the queue is full, a thread trying to store an element waits until space becomes available. Blocking queues are often used in producer-consumer scenarios: producers add elements to the queue, while consumers take elements from it. The blocking queue is the container into which the producer puts elements and from which the consumer takes them.

Blocking queues provide four handling strategies for inserting and removing elements (see the sketch after this list):

Throw an exception: when the blocking queue is full, inserting an element throws IllegalStateException("Queue full"); when the queue is empty, obtaining an element throws NoSuchElementException.
Return a special value: the insertion method returns a value indicating whether the insertion succeeded, true on success. The removal method extracts an element from the queue and returns null if the queue is empty.
Block indefinitely: when the blocking queue is full and a producer thread puts an element into it, the queue blocks the producer thread until space becomes available or the thread is interrupted. When the queue is empty and a consumer thread tries to take an element, the queue blocks the consumer thread until an element becomes available.
Time out: when the blocking queue is full, the queue blocks the producer thread for a specified period; if the timeout elapses before space becomes available, the insertion attempt gives up and returns.
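
These four strategies map onto method pairs of the BlockingQueue interface: add/remove throw exceptions, offer/poll return special values, put/take block, and the timed offer/poll give up after a timeout. The following is a minimal, illustrative sketch; the class name, capacity of 1, and element values are arbitrary choices for demonstration.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

// Minimal sketch of the four handling strategies on a queue of capacity 1 (illustrative values).
public class BlockingQueueStrategies {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<String>(1);

        queue.add("a");                                              // throws IllegalStateException if full
        System.out.println(queue.offer("b"));                        // false: the queue is already full
        System.out.println(queue.offer("c", 1, TimeUnit.SECONDS));   // waits up to 1s, then returns false

        System.out.println(queue.remove());                          // "a"; throws NoSuchElementException if empty
        System.out.println(queue.poll());                            // null: the queue is now empty
        System.out.println(queue.poll(1, TimeUnit.SECONDS));         // waits up to 1s, then returns null

        queue.put("d");                                              // would block if the queue were full
        System.out.println(queue.take());                            // "d"; would block if the queue were empty
    }
}
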
2. Blocking queues in Java

JDK 7 provides seven blocking queues. They are:

  1. ArrayBlockingQueue: a bounded blocking queue backed by an array.
  2. LinkedBlockingQueue: a bounded blocking queue backed by a linked list.
  3. PriorityBlockingQueue: an unbounded blocking queue that supports priority ordering.
  4. DelayQueue: an unbounded blocking queue implemented with a priority queue.
  5. SynchronousQueue: a blocking queue that does not store elements.
  6. LinkedTransferQueue: an unbounded blocking queue composed of linked lists.
  7. LinkedBlockingDeque: a double-ended blocking queue composed of a linked list.

ArrayBlockingQueue is a bounded blocking queue backed by an array. It orders elements on a first-in, first-out (FIFO) basis. By default, it does not guarantee fair access for waiting threads. Fair access means that all blocked producer and consumer threads access the queue in the order in which they blocked: the producer thread that blocked first inserts an element first, and the consumer thread that blocked first obtains an element first. Throughput is usually reduced to ensure fairness. We can create a fair blocking queue with the following code:

ArrayBlockingQueue fairQueue = new ArrayBlockingQueue(1000, true);

Fairness for waiting threads is implemented with a reentrant lock. The code is as follows:

public ArrayBlockingQueue(int capacity, boolean fair) {
    if (capacity <= 0)
        throw new IllegalArgumentException();
    this.items = new Object[capacity];
    lock = new ReentrantLock(fair);
    notEmpty = lock.newCondition();
    notFull = lock.newCondition();
}

LinkedBlockingQueue is a bounded blocking queue backed by a linked list. The default and maximum length of this queue is Integer.MAX_VALUE. The queue orders elements on a first-in, first-out basis.

PriorityBlockingQueue is an unbounded blocking queue that supports priorities. By default, elements are ordered by their natural ordering, in ascending order. You can also supply a Comparator to specify the ordering of elements.
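
For example, the following sketch builds a PriorityBlockingQueue whose ordering is defined by a Comparator rather than natural ordering; the class name, initial capacity, and comparator (descending string length) are illustrative choices.

import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

// Sketch: a PriorityBlockingQueue ordered by a custom Comparator (here, descending string length).
public class PriorityBlockingQueueExample {
    public static void main(String[] args) {
        PriorityBlockingQueue<String> queue = new PriorityBlockingQueue<String>(
                11,                                       // initial capacity; the queue itself is unbounded
                new Comparator<String>() {
                    public int compare(String a, String b) {
                        return b.length() - a.length();   // longer strings come out first
                    }
                });

        queue.offer("a");
        queue.offer("abc");
        queue.offer("ab");

        // poll() returns elements according to the comparator: "abc", "ab", "a".
        System.out.println(queue.poll());
        System.out.println(queue.poll());
        System.out.println(queue.poll());
    }
}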

DelayQueue is an unbounded blocking queue that supports delayed element acquisition. The queue is implemented using PriorityQueue. The elements in the queue must implement the Delayed interface. When creating an element, you can specify how long it will take to obtain the current element from the queue. Elements can be extracted from the queue only when the delay expires. We can use DelayQueue in the following scenarios:

Cache system design: use DelayQueue to hold cache elements together with their validity periods, and use one thread to poll the DelayQueue in a loop. Once an element can be obtained from the DelayQueue, its validity period has expired.
Scheduled tasks: use DelayQueue to hold the tasks to be executed that day together with their execution times; once a task can be obtained from the DelayQueue, execution starts. For example, TimerQueue is implemented with a DelayQueue.
The Delayed elements in the queue must implement compareTo to specify the ordering of elements, for example placing the element with the longest delay at the tail of the queue. The implementation code is as follows:

public int compareTo(Delayed other) {
    if (other == this) // compare zero ONLY if same object
        return 0;
    if (other instanceof ScheduledFutureTask) {
        ScheduledFutureTask x = (ScheduledFutureTask) other;
        long diff = time - x.time;
        if (diff < 0)
            return -1;
        else if (diff > 0)
            return 1;
        else if (sequenceNumber < x.sequenceNumber)
            return -1;
        else
            return 1;
    }
    long d = (getDelay(TimeUnit.NANOSECONDS) -
              other.getDelay(TimeUnit.NANOSECONDS));
    return (d == 0) ? 0 : ((d < 0) ? -1 : 1);
}

3. How to Implement the Delayed Interface

We can refer to the ScheduledFutureTask class in ScheduledThreadPoolExecutor, which implements the Delayed interface. First, when the object is created, the constructor records the time at which the object may be used. The code is as follows:


ScheduledFutureTask(Runnable r, V result, long ns, long period) {
    super(r, result);
    this.time = ns;
    this.period = period;
    this.sequenceNumber = sequencer.getAndIncrement();
}

Then, getDelay can be used to query the remaining delay of the current element. The code is as follows:

public long getDelay(TimeUnit unit) {
    return unit.convert(time - now(), TimeUnit.NANOSECONDS);
}

From the constructor we can see that the unit of the delay parameter ns is nanoseconds. It is best to design the parameter in nanoseconds, because getDelay can be called with any TimeUnit; if the stored time were in a unit coarser than nanoseconds, delays finer than that unit would lose precision, which is troublesome. Note that when time is earlier than the current time, getDelay returns a negative number.
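
Putting these pieces together, here is a hedged sketch of a custom Delayed element; the class names DelayedTask and DelayQueueExample and the delay values are illustrative, not part of the JDK.

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

// Hypothetical element type: becomes available 'delayMillis' after creation.
class DelayedTask implements Delayed {
    private final long triggerNanos;   // absolute time, in nanoseconds, when the task may be taken
    private final String name;

    DelayedTask(String name, long delayMillis) {
        this.name = name;
        this.triggerNanos = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(delayMillis);
    }

    public long getDelay(TimeUnit unit) {
        // Convert the remaining delay (kept in nanoseconds) into whatever unit the caller asks for.
        return unit.convert(triggerNanos - System.nanoTime(), TimeUnit.NANOSECONDS);
    }

    public int compareTo(Delayed other) {
        // Order elements by remaining delay so the element expiring soonest sits at the queue head.
        long diff = getDelay(TimeUnit.NANOSECONDS) - other.getDelay(TimeUnit.NANOSECONDS);
        return (diff == 0) ? 0 : (diff < 0 ? -1 : 1);
    }

    public String toString() {
        return name;
    }
}

public class DelayQueueExample {
    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<DelayedTask>();
        queue.put(new DelayedTask("later", 2000));
        queue.put(new DelayedTask("soon", 500));
        // take() blocks until the head element's delay has expired, so "soon" comes out first.
        System.out.println(queue.take());
        System.out.println(queue.take());
    }
}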

4. How to Implement a Delayed Queue

The implementation of the delayed queue is simple: when a consumer takes an element from the queue, if the element's delay has not yet expired, the current thread is blocked. The key code is as follows:

long delay = first.getDelay(TimeUnit.NANOSECONDS);
if (delay <= 0)
    return q.poll();
else if (leader != null)
    available.await();

SynchronousQueue is a blocking queue that does not store elements. Each put operation must wait for a take operation; otherwise, the element cannot be added. A SynchronousQueue can be viewed as a relay that passes data directly from a producer thread to a consumer thread. The queue itself stores no elements, which makes it well suited to hand-off scenarios, for example passing data produced in one thread to another thread for use. The throughput of SynchronousQueue is higher than that of LinkedBlockingQueue and ArrayBlockingQueue.
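
A minimal hand-off sketch follows; the class name, sleep duration, and string value are illustrative. The producer's put() parks until the consumer's take() arrives.

import java.util.concurrent.SynchronousQueue;

// Minimal hand-off sketch: put() blocks until a matching take() arrives.
public class SynchronousQueueExample {
    public static void main(String[] args) throws InterruptedException {
        final SynchronousQueue<String> queue = new SynchronousQueue<String>();

        Thread producer = new Thread(new Runnable() {
            public void run() {
                try {
                    queue.put("hello");           // blocks here until a consumer takes the element
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        producer.start();

        Thread.sleep(100);                        // by now the producer is parked inside put()
        System.out.println(queue.take());         // unblocks the producer and prints "hello"
    }
}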

LinkedTransferQueue is an unbounded blocking TransferQueue backed by a linked list. Compared with other blocking queues, LinkedTransferQueue additionally provides the tryTransfer and transfer methods.

The transfer method. If a consumer is waiting to receive an element (for example, blocked in take() or a timed poll()), the transfer method immediately passes the element supplied by the producer to that consumer. If no consumer is waiting, the transfer method stores the element in the tail node of the queue and does not return until the element has been consumed. The key code of the transfer method is as follows:

Node pred = tryAppend(s, haveData);
return awaitMatch(s, pred, e, (how == TIMED), nanos);

The first line appends node s, which holds the current element, as the tail node. The second line spin-waits for a consumer to consume the element. Because spinning consumes CPU, after a certain number of spins the Thread.yield() method is called to pause the currently executing thread and let other threads run.

The tryTransfer method. It tests whether the element supplied by the producer can be passed directly to a consumer. If no consumer is waiting to receive the element, it returns false. The difference from the transfer method is that tryTransfer returns immediately regardless of whether a consumer received the element, whereas transfer returns only after the element has been consumed.

The timed tryTransfer(E e, long timeout, TimeUnit unit) method attempts to pass the element supplied by the producer directly to a consumer; if no consumer consumes it, the method waits for the specified time and then returns. If the element has not been consumed when the timeout elapses, false is returned; if it is consumed within the timeout, true is returned.
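
The following sketch contrasts the two methods; the class name and string values are illustrative. tryTransfer gives up immediately when no consumer is waiting, while transfer blocks until the element has actually been received.

import java.util.concurrent.LinkedTransferQueue;

// Sketch: tryTransfer() versus transfer() on a LinkedTransferQueue.
public class TransferQueueExample {
    public static void main(String[] args) throws InterruptedException {
        final LinkedTransferQueue<String> queue = new LinkedTransferQueue<String>();

        // No consumer is waiting yet, so tryTransfer gives up immediately and returns false.
        System.out.println("tryTransfer: " + queue.tryTransfer("eager hand-off"));

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    System.out.println("consumed: " + queue.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        consumer.start();

        // transfer blocks until the consumer has actually received the element.
        queue.transfer("blocking hand-off");
        System.out.println("transfer returned: the element was consumed");
        consumer.join();
    }
}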

LinkedBlockingDeque is a double-ended blocking queue backed by a linked list. A double-ended queue lets you insert and remove elements at both ends of the queue. Because a double-ended queue provides one more entry point for operations, contention is roughly halved when multiple threads enqueue at the same time. Compared with other blocking queues, LinkedBlockingDeque has the additional methods addFirst, addLast, offerFirst, offerLast, peekFirst, peekLast, and so on. Methods ending in First insert, inspect (peek), or remove the first element of the deque; methods ending in Last insert, inspect, or remove the last element. In addition, the insertion method add is equivalent to addLast and the removal method remove is equivalent to removeFirst, yet the take method is equivalent to takeFirst; I don't know whether this is a JDK bug. It is clearer to use the methods with the First and Last suffixes.

When initializing a LinkedBlockingDeque, you can set the capacity to prevent it from growing excessively. In addition, the double-ended blocking queue can be used in the "work stealing" pattern.
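
As a small, illustrative sketch (the class name, capacity of 10, and element values are arbitrary), the following shows the First/Last method pairs on a capacity-bounded deque.

import java.util.concurrent.LinkedBlockingDeque;

// Sketch of the First/Last method pairs on a capacity-bounded deque.
public class LinkedBlockingDequeExample {
    public static void main(String[] args) throws InterruptedException {
        LinkedBlockingDeque<String> deque = new LinkedBlockingDeque<String>(10);

        deque.addFirst("head");                 // insert at the front
        deque.addLast("tail");                  // insert at the back
        deque.add("also-tail");                 // add is equivalent to addLast

        System.out.println(deque.peekFirst());  // "head" (inspect without removing)
        System.out.println(deque.peekLast());   // "also-tail"

        System.out.println(deque.takeFirst());  // "head" (take is equivalent to takeFirst)
        System.out.println(deque.takeLast());   // "also-tail"
    }
}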

5. Implementation principle of blocking queues
This article takes ArrayBlockingQueue as an example. The implementation principle of other blocking queues may be different from that of ArrayBlockingQueue, but the general idea should be similar. If you are interested, you can view the source code of other blocking queues.

First, let's take a look at several member variables in the ArrayBlockingQueue class:

public class ArrayBlockingQueue<E> extends AbstractQueue<E>
        implements BlockingQueue<E>, java.io.Serializable {

    private static final long serialVersionUID = -817911632652898426L;

    /** The queued items */
    private final E[] items;
    /** items index for next take, poll or remove */
    private int takeIndex;
    /** items index for next put, offer, or add. */
    private int putIndex;
    /** Number of items in the queue */
    private int count;

    /*
     * Concurrency control uses the classic two-condition algorithm
     * found in any textbook.
     */

    /** Main lock guarding all access */
    private final ReentrantLock lock;
    /** Condition for waiting takes */
    private final Condition notEmpty;
    /** Condition for waiting puts */
    private final Condition notFull;
}

It can be seen that ArrayBlockingQueue actually uses an array to store elements. takeIndex is the index of the next element to take (the queue head), putIndex is the index of the next slot to fill (the queue tail), and count is the number of elements in the queue.

lock is a reentrant lock, and notEmpty and notFull are wait conditions.

Let's take a look at the constructor of ArrayBlockingQueue. The constructor has three overloaded versions:

public ArrayBlockingQueue(int capacity) { }
public ArrayBlockingQueue(int capacity, boolean fair) { }
public ArrayBlockingQueue(int capacity, boolean fair, Collection<? extends E> c) { }

The first constructor takes only the capacity; the second can specify the capacity and fairness; the third can specify the capacity and fairness and initialize the queue from another collection.

Then let's look at the implementation of the two key methods, put() and take():

public void put(E e) throws InterruptedException {
    if (e == null) throw new NullPointerException();
    final E[] items = this.items;
    final ReentrantLock lock = this.lock;
    lock.lockInterruptibly();
    try {
        try {
            while (count == items.length)
                notFull.await();
        } catch (InterruptedException ie) {
            notFull.signal(); // propagate to non-interrupted thread
            throw ie;
        }
        insert(e);
    } finally {
        lock.unlock();
    }
}

From the implementation of the put method, we can see that it first acquires the lock interruptibly, then checks whether the current number of elements equals the length of the array. If so, it calls notFull.await() to wait. If an InterruptedException is caught, the signal is propagated to another waiting thread and the exception is rethrown.

When the thread is later awakened by another thread and space is available, insert(e) is called and the lock is released in the finally block.

Let's take a look at the implementation of the insert method:

private void insert(E x) {
    items[putIndex] = x;
    putIndex = inc(putIndex);
    ++count;
    notEmpty.signal();
}

It is a private method. After a successful insertion, notEmpty.signal() wakes up a thread waiting to take an element.

The implementation of the take() method is as follows:

public E take() throws InterruptedException {
    final ReentrantLock lock = this.lock;
    lock.lockInterruptibly();
    try {
        try {
            while (count == 0)
                notEmpty.await();
        } catch (InterruptedException ie) {
            notEmpty.signal(); // propagate to non-interrupted thread
            throw ie;
        }
        E x = extract();
        return x;
    } finally {
        lock.unlock();
    }
}


The take method mirrors the put method: put waits on the notFull condition, while take waits on notEmpty. In the take method, once an element is available it is removed through the extract method. The following is the implementation of the extract method:


private E extract() {
    final E[] items = this.items;
    E x = items[takeIndex];
    items[takeIndex] = null;
    takeIndex = inc(takeIndex);
    --count;
    notFull.signal();
    return x;
}

It mirrors the insert method: after removing the element, it signals notFull to wake up a thread waiting to put.

From this we can understand the implementation principle of the blocking queue: the idea is similar to implementing producer-consumer with Object.wait(), Object.notify(), and a non-blocking queue, except that this work is integrated into the blocking queue itself.
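
To tie this together, here is a minimal producer-consumer sketch built on ArrayBlockingQueue; the class name, capacity, and element counts are illustrative only.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal producer-consumer sketch: put() blocks when full, take() blocks when empty.
public class ProducerConsumerExample {
    public static void main(String[] args) throws InterruptedException {
        final BlockingQueue<Integer> queue = new ArrayBlockingQueue<Integer>(5);

        Thread producer = new Thread(new Runnable() {
            public void run() {
                try {
                    for (int i = 0; i < 10; i++) {
                        queue.put(i);                                // blocks when the queue is full
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    for (int i = 0; i < 10; i++) {
                        System.out.println("took " + queue.take()); // blocks when the queue is empty
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}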
