From The Art of Java Concurrency Programming
What is a blocking queue
A blocking queue (BlockingQueue) is a queue that supports two additional operations: a blocking insert and a blocking remove.
- Blocking insert: when the queue is full, the queue blocks the thread inserting the element until the queue is no longer full.
- Blocking remove: when the queue is empty, the thread fetching an element waits until the queue becomes non-empty.
Blocking queues are often used in producer-consumer scenarios, where the producer is the thread that adds elements to the queue and the consumer is the thread that takes elements from the queue. The blocking queue is the container in which the producer stores elements and from which the consumer fetches them.
These two additional operations come in four handling modes for when the queue is not available, as described below:
4 ways to handle insert and remove operations
- Throw an exception: when the queue is full, inserting another element into the queue throws an IllegalStateException("Queue full"). When the queue is empty, fetching an element from the queue throws a NoSuchElementException.
- Return a special value: when inserting an element into the queue, the insert method returns true on success. The removal method takes an element out of the queue and returns null if there is none.
- Always block: when the blocking queue is full, if a producer thread puts an element into the queue, the queue blocks the producer thread until the queue becomes available or the thread responds to an interrupt and exits. When the queue is empty, if a consumer thread takes an element from the queue, the queue blocks the consumer thread until the queue becomes non-empty.
- Timeout exit: when the blocking queue is full, if a producer thread inserts an element into the queue, the queue blocks the producer thread for a period of time; if the specified time is exceeded, the producer thread exits. (The method pairs behind these four modes are illustrated in the sketch after this list.)
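To make the four handling modes concrete, here is a small sketch (my own illustration, not from the original text; the class name HandlingModes is arbitrary) that exercises the corresponding BlockingQueue method pairs, add/remove, offer/poll, put/take and the timed offer/poll, on an ArrayBlockingQueue of capacity 1:

    import java.util.NoSuchElementException;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    public class HandlingModes {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> queue = new ArrayBlockingQueue<>(1);

            // Special value: offer returns false instead of blocking when the queue is full
            System.out.println(queue.offer("a"));   // true
            System.out.println(queue.offer("b"));   // false, queue is full

            // Throw exception: add throws IllegalStateException("Queue full") on a full queue
            try {
                queue.add("c");
            } catch (IllegalStateException e) {
                System.out.println("add failed: " + e.getMessage());
            }

            // Timeout exit: wait up to 100 ms for space, then give up and return false
            System.out.println(queue.offer("d", 100, TimeUnit.MILLISECONDS));

            // Always block: take blocks until an element is available (here one already is)
            System.out.println(queue.take());       // "a"

            // Special value: poll returns null instead of throwing when the queue is empty
            System.out.println(queue.poll());       // null

            // Throw exception: remove throws NoSuchElementException on an empty queue
            try {
                queue.remove();
            } catch (NoSuchElementException e) {
                System.out.println("remove failed on empty queue");
            }
        }
    }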
Blocking queues in Java
JDK 7 provides 7 blocking queues, as follows:
- ArrayBlockingQueue: a bounded blocking queue backed by an array.
- LinkedBlockingQueue: a bounded blocking queue backed by a linked list.
- PriorityBlockingQueue: an unbounded blocking queue that supports priority ordering.
- DelayQueue: an unbounded blocking queue implemented with a priority queue.
- SynchronousQueue: a blocking queue that does not store elements.
- LinkedTransferQueue: an unbounded blocking queue backed by a linked list.
- LinkedBlockingDeque: a bidirectional (double-ended) blocking queue backed by a linked list.
1. ArrayBlockingQueue
ArrayBlockingQueue is a bounded blocking queue implemented with an array. This queue orders elements on a first-in-first-out (FIFO) basis.
By default, fair access to the queue by threads is not guaranteed. A fair-access queue means that blocked threads access the queue in the order in which they blocked, that is, the thread that blocked first accesses the queue first. Unfairness is unfair to the threads that have waited longest: when the queue becomes available, all blocked threads compete for access, and the thread that blocked first may end up accessing the queue last. Guaranteeing fairness usually reduces throughput.
Fairness or unfairness can be chosen when constructing the blocking queue. The fairness of ArrayBlockingQueue is implemented with a ReentrantLock.
    public ArrayBlockingQueue(int capacity, boolean fair) {
        if (capacity <= 0)
            throw new IllegalArgumentException();
        this.items = new Object[capacity];
        lock = new ReentrantLock(fair);
        notEmpty = lock.newCondition();
        notFull  = lock.newCondition();
    }
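As a usage sketch (not part of the quoted source; the class name FairQueueDemo is arbitrary), the following producer/consumer pair builds a fair ArrayBlockingQueue, so waiting threads are granted access in the order they blocked; put blocks while the queue is full and take blocks while it is empty:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class FairQueueDemo {
        public static void main(String[] args) {
            // Capacity 2, fair = true: waiting threads access the queue in FIFO order
            BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2, true);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        queue.put(i);                 // blocks while the queue is full
                        System.out.println("produced " + i);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        Integer value = queue.take(); // blocks while the queue is empty
                        System.out.println("consumed " + value);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
        }
    }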
2. LinkedBlockingQueue
LinkedBlockingQueue is a bounded blocking queue implemented with a linked list. The default and maximum length of this queue is Integer.MAX_VALUE. This queue orders elements on a FIFO basis.
3. PriorityBlockingQueue
PriorityBlockingQueue is an unbounded blocking queue that supports priorities. By default, elements are ordered in ascending natural order. You can also specify a custom ordering by implementing the compareTo() method in the element class, or by passing a Comparator as a construction parameter when initializing the PriorityBlockingQueue. Note that the order of elements with the same priority is not guaranteed.
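A brief sketch of both ordering options (my own illustration; the Task class is hypothetical): natural ordering via compareTo, and an explicit Comparator passed to the constructor:

    import java.util.Comparator;
    import java.util.concurrent.PriorityBlockingQueue;

    public class PriorityDemo {
        // Hypothetical element type that defines its own natural ordering
        static class Task implements Comparable<Task> {
            final int priority;
            Task(int priority) { this.priority = priority; }

            @Override
            public int compareTo(Task other) {
                // Smaller priority value is taken from the queue first
                return Integer.compare(this.priority, other.priority);
            }
        }

        public static void main(String[] args) throws InterruptedException {
            // Natural (ascending) ordering defined by Task.compareTo
            PriorityBlockingQueue<Task> byNaturalOrder = new PriorityBlockingQueue<>();
            byNaturalOrder.put(new Task(3));
            byNaturalOrder.put(new Task(1));
            System.out.println(byNaturalOrder.take().priority);   // 1

            // Explicit Comparator supplied at construction time (initial capacity 11)
            PriorityBlockingQueue<Task> byComparator =
                    new PriorityBlockingQueue<>(11, Comparator.comparingInt((Task t) -> -t.priority));
            byComparator.put(new Task(3));
            byComparator.put(new Task(1));
            System.out.println(byComparator.take().priority);     // 3
        }
    }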
4. DelayQueue
DelayQueue is an unbounded blocking queue that supports delayed retrieval of elements. The queue is implemented with a PriorityQueue. Elements in the queue must implement the Delayed interface; when an element is created, you specify how long it takes before it can be taken from the queue. Elements can be taken from the queue only when their delay has expired.
DelayQueue is very useful and can be applied in scenarios such as the following:
- Design of a caching system: use DelayQueue to hold the validity period of cached elements and let a thread loop querying the DelayQueue; once an element can be obtained from the DelayQueue, its cache validity period has expired.
- Scheduled task scheduling: use DelayQueue to hold the tasks to be executed that day and their execution times; once a task can be obtained from the DelayQueue, it is executed. For example, TimerQueue is implemented with DelayQueue.
(1) How to implement the Delayed interface
Elements of a DelayQueue must implement the Delayed interface. We can refer to the implementation of the ScheduledFutureTask class in ScheduledThreadPoolExecutor, which takes three steps in total.
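Roughly, the three steps are: record the trigger time when the element is created, implement getDelay to report the remaining delay, and implement compareTo to order elements by their trigger time. Here is a minimal, hypothetical DelayedItem sketch along those lines (my own illustration, not the actual ScheduledFutureTask code):

    import java.util.concurrent.DelayQueue;
    import java.util.concurrent.Delayed;
    import java.util.concurrent.TimeUnit;

    public class DelayedItem implements Delayed {
        private final String name;
        private final long triggerTimeNanos;   // absolute deadline, recorded at creation time

        public DelayedItem(String name, long delay, TimeUnit unit) {
            this.name = name;
            this.triggerTimeNanos = System.nanoTime() + unit.toNanos(delay);
        }

        @Override
        public long getDelay(TimeUnit unit) {
            // Remaining delay; zero or negative means the element may be taken
            return unit.convert(triggerTimeNanos - System.nanoTime(), TimeUnit.NANOSECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            // Order elements so the one that expires first is at the head of the queue
            return Long.compare(getDelay(TimeUnit.NANOSECONDS), other.getDelay(TimeUnit.NANOSECONDS));
        }

        public static void main(String[] args) throws InterruptedException {
            DelayQueue<DelayedItem> queue = new DelayQueue<>();
            queue.put(new DelayedItem("later", 2, TimeUnit.SECONDS));
            queue.put(new DelayedItem("sooner", 1, TimeUnit.SECONDS));
            // take() blocks until the head element's delay has expired
            System.out.println(queue.take().name);   // prints "sooner" after about one second
        }
    }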
5. SynchronousQueue
SynchronousQueue is a blocking queue that does not store elements. Each put operation must wait for a take operation; otherwise no more elements can be added.
It supports fair access to the queue. By default, threads access the queue with a non-fair policy. Use the following constructor to create a fair-access SynchronousQueue; if fair is set to true, waiting threads access the queue in FIFO order.
    public SynchronousQueue(boolean fair) {
        transferer = fair ? new TransferQueue() : new TransferStack();
    }
SynchronousQueue can be seen as a relay that passes data from a producer thread directly to a consumer thread. The queue itself does not store any elements and is well suited to such pass-through scenarios. The throughput of SynchronousQueue is higher than that of LinkedBlockingQueue and ArrayBlockingQueue.
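A small hand-off sketch (not from the source; the class name HandoffDemo is arbitrary): put does not return until a consumer's take has accepted the element, which is what makes SynchronousQueue a pure pass-through:

    import java.util.concurrent.SynchronousQueue;

    public class HandoffDemo {
        public static void main(String[] args) throws InterruptedException {
            // fair = true: waiting threads are served in FIFO order
            SynchronousQueue<String> queue = new SynchronousQueue<>(true);

            Thread consumer = new Thread(() -> {
                try {
                    // take blocks until a producer hands over an element
                    System.out.println("received: " + queue.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            consumer.start();

            // put blocks until the consumer above has taken the element
            queue.put("hello");
            System.out.println("hand-off complete");
            consumer.join();
        }
    }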
6. LinkedTransferQueue
LinkedTransferQueue is an unbounded TransferQueue blocking queue composed of a linked list structure. Compared with other blocking queues, LinkedTransferQueue has the additional tryTransfer and transfer methods.
(1) The transfer method
If a consumer is currently waiting to receive an element (the consumer is using the take() method or the time-limited poll() method), the transfer method can immediately hand the element passed in by the producer to that consumer. If no consumer is waiting for an element, the transfer method stores the element in the queue's tail node and does not return until the element has been consumed by a consumer. The key code of the transfer method is as follows.
    Node pred = tryAppend(s, haveData);
    return awaitMatch(s, pred, e, (how == TIMED), nanos);
The first line of code tries to append the s node, which holds the current element, as the tail node. The second line of code makes the CPU spin-wait for a consumer to consume the element. Because spinning consumes CPU, after spinning a certain number of times it calls Thread.yield() to pause the currently executing thread and let other threads run.
(2) The tryTransfer method
The tryTransfer method is used to test whether an element passed in by the producer can be handed directly to a consumer. It returns false if no consumer is waiting to receive an element. The difference from the transfer method is that tryTransfer returns immediately regardless of whether a consumer received the element, whereas the transfer method must wait until a consumer has consumed the element before returning.
The time-limited tryTransfer(E e, long timeout, TimeUnit unit) method tries to hand the producer's element directly to a consumer; if no consumer consumes the element, it waits for the specified time before returning. It returns false if the element has not been consumed when the timeout expires, and true if the element was consumed within the timeout.
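An illustrative sketch (not from the source; the class name TransferDemo is arbitrary) contrasting the methods: tryTransfer returns false immediately when no consumer is waiting, the timed variant gives up after the timeout, and transfer blocks until a consumer takes the element:

    import java.util.concurrent.LinkedTransferQueue;
    import java.util.concurrent.TimeUnit;

    public class TransferDemo {
        public static void main(String[] args) throws InterruptedException {
            LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

            // No consumer is waiting yet, so tryTransfer fails immediately
            System.out.println(queue.tryTransfer("a"));                          // false

            // Time-limited variant: waits up to 100 ms for a consumer, then returns false
            System.out.println(queue.tryTransfer("b", 100, TimeUnit.MILLISECONDS));

            // Start a consumer; transfer then returns only once that consumer takes the element
            Thread consumer = new Thread(() -> {
                try {
                    System.out.println("consumed: " + queue.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            consumer.start();
            queue.transfer("c");            // returns once the consumer has received "c"
            consumer.join();
        }
    }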
7. LinkedBlockingDeque
LinkedBlockingDeque is a bidirectional (double-ended) blocking queue composed of a linked list structure. A double-ended queue is one into which elements can be inserted, and from which elements can be removed, at both ends. Because the double-ended queue has one more entry point for operations, it halves contention when multiple threads enqueue at the same time.

Compared with other blocking queues, LinkedBlockingDeque has additional methods such as addFirst, addLast, offerFirst, offerLast, peekFirst and peekLast. Methods ending with the word First insert, get (peek) or remove the first element of the double-ended queue; methods ending with the word Last insert, get or remove the last element. In addition, the insert method add is equivalent to addLast, and the remove method remove is equivalent to removeFirst. However, the take method is equivalent to takeFirst; it is unclear whether this is a JDK bug, so it is clearer to use the methods with the First and Last suffixes.
You can set a capacity when initializing a LinkedBlockingDeque to prevent it from growing too large. In addition, this bidirectional blocking queue can be used in the "work-stealing" pattern.
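A short sketch of the double-ended API (my own illustration; the class name DequeDemo is arbitrary), taking elements from the head as the owning worker would and from the tail as a "stealing" worker would, in the spirit of the work-stealing pattern mentioned above:

    import java.util.concurrent.LinkedBlockingDeque;

    public class DequeDemo {
        public static void main(String[] args) throws InterruptedException {
            // Capacity set at construction time to bound the deque
            LinkedBlockingDeque<String> deque = new LinkedBlockingDeque<>(10);

            deque.addFirst("task-1");
            deque.addLast("task-2");        // add is equivalent to addLast
            deque.offerLast("task-3");

            System.out.println(deque.peekFirst());   // task-1 (does not remove)
            System.out.println(deque.peekLast());    // task-3 (does not remove)

            // The owning worker takes from the head...
            System.out.println(deque.takeFirst());   // task-1
            // ...while a stealing worker could take from the tail
            System.out.println(deque.pollLast());    // task-3
        }
    }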
How blocking queues are implemented
If the queue is empty, the consumer waits until the producer adds an element. How does the consumer know that the queue now has elements? If you were asked to design a blocking queue, how would you design it so that producers and consumers communicate efficiently? Let's first look at how the JDK implements it.
- Using the notification pattern
The so-called notification pattern works like this: when a producer is blocked because it tried to add an element to a full queue, and a consumer then consumes an element from the queue, the consumer notifies the producer that the queue is now available. Looking at the JDK source code, we find that ArrayBlockingQueue is implemented using Condition. The code is as follows:
    public ArrayBlockingQueue(int capacity, boolean fair) {
        // Part of the code snippet is omitted
        notEmpty = lock.newCondition();
        notFull  = lock.newCondition();
    }

    public void put(E e) throws InterruptedException {
        checkNotNull(e);
        final ReentrantLock lock = this.lock;
        lock.lockInterruptibly();
        try {
            while (count == items.length)   // see here
                notFull.await();
            enqueue(e);
        } finally {
            lock.unlock();
        }
    }

    public E take() throws InterruptedException {
        final ReentrantLock lock = this.lock;
        lock.lockInterruptibly();
        try {
            while (count == 0)
                notEmpty.await();
            return dequeue();
        } finally {
            lock.unlock();
        }
    }
When an element is inserted into the queue and the queue is not available, blocking the producer is implemented mainly by LockSupport.park(this):
    public final void await() throws InterruptedException {
        if (Thread.interrupted())
            throw new InterruptedException();
        Node node = addConditionWaiter();
        int savedState = fullyRelease(node);
        int interruptMode = 0;
        while (!isOnSyncQueue(node)) {
            LockSupport.park(this);
            if ((interruptMode = checkInterruptWhileWaiting(node)) != 0)
                break;
        }
        if (acquireQueued(node, savedState) && interruptMode != THROW_IE)
            interruptMode = REINTERRUPT;
        if (node.nextWaiter != null) // clean up if cancelled
            unlinkCancelledWaiters();
        if (interruptMode != 0)
            reportInterruptAfterWait(interruptMode);
    }
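To make the notification pattern concrete, here is a deliberately simplified bounded queue sketch (my own illustration, not the JDK source; the class name SimpleBlockingQueue is arbitrary) built on a ReentrantLock and two Condition objects, mirroring the notFull/notEmpty structure shown above:

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.concurrent.locks.Condition;
    import java.util.concurrent.locks.ReentrantLock;

    public class SimpleBlockingQueue<E> {
        private final Deque<E> items = new ArrayDeque<>();
        private final int capacity;
        private final ReentrantLock lock = new ReentrantLock();
        private final Condition notFull  = lock.newCondition();
        private final Condition notEmpty = lock.newCondition();

        public SimpleBlockingQueue(int capacity) {
            this.capacity = capacity;
        }

        public void put(E e) throws InterruptedException {
            lock.lockInterruptibly();
            try {
                while (items.size() == capacity)
                    notFull.await();          // producer parks until a consumer signals
                items.addLast(e);
                notEmpty.signal();            // notify one waiting consumer
            } finally {
                lock.unlock();
            }
        }

        public E take() throws InterruptedException {
            lock.lockInterruptibly();
            try {
                while (items.isEmpty())
                    notEmpty.await();         // consumer parks until a producer signals
                E e = items.removeFirst();
                notFull.signal();             // notify one waiting producer
                return e;
            } finally {
                lock.unlock();
            }
        }
    }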