The thread pool Motan uses to process requests on the service provider side

I have recently started reading the Motan source code again, and I plan to spend about a year analyzing how each module implements its functionality, keeping up a series of Motan analysis posts along the way.

Since I don't yet have a complete picture, I am only reading fragments at a time; once I have accumulated enough, I will string the code I have read together into a coherent analysis.

This first post answers a question of my own: what is Motan's threading model on the service provider side? After a request arrives at the provider, which thread processes it?

Motan uses Netty as its TCP communication framework, and Netty's threading model is the reactor model: an acceptor thread is responsible for accepting connections, and the resulting channels are handed to the reactor threads, which handle those channels' read and write events.

So when a request arrives at the provider, it is first handled by a reactor thread, which decodes it into a Java Request object. Here a problem arises: our services usually access databases and other resources, which means blocking I/O. If the request were processed directly on the reactor thread, that thread would be blocked and unable to handle read and write events on other channels, and high concurrency would be impossible. So a thread pool is used to process the request instead.

Motan's code:

public class NettyChannelHandler extends SimpleChannelHandler {

    private ThreadPoolExecutor threadPoolExecutor;
    ...

    private void processRequest(final ChannelHandlerContext ctx, MessageEvent e) {
        final Request request = (Request) e.getMessage();
        request.setAttachment(URLParamType.host.getName(),
                NetUtils.getHostName(ctx.getChannel().getRemoteAddress()));

        final long processStartTime = System.currentTimeMillis();

        // hand the request off to the thread pool instead of processing it on the reactor thread
        try {
            threadPoolExecutor.execute(new Runnable() {
                @Override
                public void run() {
                    processRequest(ctx, request, processStartTime);
                }
            });
        } catch (RejectedExecutionException rejectException) {
            DefaultResponse response = new DefaultResponse();
            response.setRequestId(request.getRequestId());
            response.setException(new MotanServiceException("process thread pool is full, reject",
                    MotanErrorMsgConstant.SERVICE_REJECT));
            response.setProcessTime(System.currentTimeMillis() - processStartTime);
            e.getChannel().write(response);

            LoggerUtil.debug(
                    "process thread pool is full, reject, active={} poolSize={} corePoolSize={} maxPoolSize={} taskCount={} requestId={}",
                    threadPoolExecutor.getActiveCount(), threadPoolExecutor.getPoolSize(),
                    threadPoolExecutor.getCorePoolSize(), threadPoolExecutor.getMaximumPoolSize(),
                    threadPoolExecutor.getTaskCount(), request.getRequestId());
        }
    }
}

The flow is: the reactor thread decodes the request into a Request object and submits it to the thread pool; if the pool rejects the task, a response carrying a "process thread pool is full" exception is written back on the channel directly.

When processing requests, Motan does not use the JDK thread pool directly; instead it extends ThreadPoolExecutor with a custom implementation.

Implementation code for StandardThreadExecutor:

public class StandardThreadExecutor extends ThreadPoolExecutor {

    public static final int DEFAULT_MIN_THREADS = 20;
    public static final int DEFAULT_MAX_THREADS = 200;
    public static final int DEFAULT_MAX_IDLE_TIME = 60 * 1000; // 1 minute

    protected AtomicInteger submittedTasksCount;   // number of tasks currently submitted (queued or running)
    private int maxSubmittedTaskCount;             // maximum number of tasks allowed at the same time

    ...

    public StandardThreadExecutor(int coreThreads, int maxThreads, long keepAliveTime, TimeUnit unit,
            int queueCapacity, ThreadFactory threadFactory, RejectedExecutionHandler handler) {
        super(coreThreads, maxThreads, keepAliveTime, unit, new ExecutorQueue(), threadFactory, handler);
        ((ExecutorQueue) getQueue()).setStandardThreadExecutor(this);

        submittedTasksCount = new AtomicInteger(0);

        // max concurrent task limit: queue capacity + maximum number of threads
        maxSubmittedTaskCount = queueCapacity + maxThreads;
    }

    public void execute(Runnable command) {
        int count = submittedTasksCount.incrementAndGet();

        // reject when the max concurrent task limit is exceeded; the underlying
        // LinkedTransferQueue has no length limit, so the limit is enforced here
        if (count > maxSubmittedTaskCount) {
            submittedTasksCount.decrementAndGet();
            getRejectedExecutionHandler().rejectedExecution(command, this);
        }

        try {
            super.execute(command);
        } catch (RejectedExecutionException rx) {
            // there could have been contention around the queue
            if (!((ExecutorQueue) getQueue()).force(command)) {
                submittedTasksCount.decrementAndGet();
                getRejectedExecutionHandler().rejectedExecution(command, this);
            }
        }
    }

    public int getSubmittedTasksCount() {
        return this.submittedTasksCount.get();
    }

    public int getMaxSubmittedTaskCount() {
        return maxSubmittedTaskCount;
    }

    protected void afterExecute(Runnable r, Throwable t) {
        submittedTasksCount.decrementAndGet();
    }
}

Here, StandardThreadExecutor overrides ThreadPoolExecutor's execute and afterExecute methods. One detail that deserves close attention is the use of an ExecutorQueue as the BlockingQueue.

Let's look at ExecutorQueue's implementation:

/**
 * LinkedTransferQueue offers noticeably better performance than LinkedBlockingQueue.
 *
 * 1) Its drawback is that it has no queue length control, so the limit has to be enforced by the caller.
 */
class ExecutorQueue extends LinkedTransferQueue<Runnable> {

    private static final long serialVersionUID = -265236426751004839L;

    StandardThreadExecutor threadPoolExecutor;

    public ExecutorQueue() {
        super();
    }

    public void setStandardThreadExecutor(StandardThreadExecutor threadPoolExecutor) {
        this.threadPoolExecutor = threadPoolExecutor;
    }

    // note: this code originates from Tomcat
    public boolean force(Runnable o) {
        if (threadPoolExecutor.isShutdown()) {
            throw new RejectedExecutionException("Executor not running, can't force a command into the queue");
        }
        // forces the item onto the queue, to be used if the task is rejected
        return super.offer(o);
    }

    // note: minor changes to the Tomcat code
    public boolean offer(Runnable o) {
        int poolSize = threadPoolExecutor.getPoolSize();

        // we are maxed out on threads, simply queue the object
        if (poolSize == threadPoolExecutor.getMaximumPoolSize()) {
            return super.offer(o);
        }
        // we have idle threads, just add it to the queue
        // note that we don't use getActiveCount(), see BZ 49730
        if (threadPoolExecutor.getSubmittedTasksCount() <= poolSize) {
            return super.offer(o);
        }
        // if we have fewer threads than the maximum, force creation of a new thread
        if (poolSize < threadPoolExecutor.getMaximumPoolSize()) {
            return false;
        }
        // if we reached here, we need to add it to the queue
        return super.offer(o);
    }
}

ExecutorQueue extends LinkedTransferQueue, mainly because LinkedTransferQueue performs significantly better than LinkedBlockingQueue and similar queues. Its drawback is that it has no length limit, which makes it prone to memory overflow. So in the Motan code, execute(Runnable) increments the count of submitted tasks and afterExecute decrements it, maintaining a running count of in-flight tasks together with a maximum limit (queue capacity + maximum threads). When a task is submitted and the count exceeds that limit, the rejection policy is applied, which effectively provides the missing queue length control.
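As a quick illustration, here is a minimal usage sketch of the class above. It relies only on the constructor and the getSubmittedTasksCount() accessor shown in the excerpt; the thread counts and queue capacity are illustrative values, not Motan's defaults.

import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class StandardThreadExecutorUsage {
    public static void main(String[] args) {
        // 20 core threads, up to 200 threads, 1-minute keep-alive, queue capacity 1000:
        // at most 1000 + 200 = 1200 tasks may be in flight before new tasks are rejected
        StandardThreadExecutor executor = new StandardThreadExecutor(
                20, 200, 60 * 1000, TimeUnit.MILLISECONDS, 1000,
                Executors.defaultThreadFactory(), new ThreadPoolExecutor.AbortPolicy());

        executor.execute(new Runnable() {
            @Override
            public void run() {
                System.out.println("processing a request...");
            }
        });

        // submittedTasksCount is incremented in execute() and decremented in afterExecute(),
        // so at any moment it reflects queued plus running tasks
        System.out.println("in-flight tasks: " + executor.getSubmittedTasksCount());

        executor.shutdown();
    }
}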

In addition, ExecutorQueue overrides the offer method. With an unbounded LinkedBlockingQueue, offer effectively always returns true, so a plain ThreadPoolExecutor never grows beyond its core (minimum) thread count. Motan's change is to return false when the number of submitted tasks exceeds poolSize while poolSize is still below the maximum pool size, which makes the executor create a new thread instead of queueing the task.
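To see the JDK behavior that Motan works around, here is a small, self-contained demonstration (an illustrative sketch, not Motan code): with an unbounded LinkedBlockingQueue, offer() always succeeds, so the pool never grows past its core size no matter how large maximumPoolSize is.

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class UnboundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // core size 2, maximum size 200, unbounded work queue
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 200, 60, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());

        for (int i = 0; i < 50; i++) {
            pool.execute(new Runnable() {
                @Override
                public void run() {
                    try {
                        Thread.sleep(1000); // simulate blocking work such as a database call
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }

        Thread.sleep(200);
        // offer() on the unbounded queue always returns true, so only the 2 core threads
        // are ever created; the remaining tasks simply wait in the queue
        System.out.println("poolSize=" + pool.getPoolSize() + ", queued=" + pool.getQueue().size());
        pool.shutdownNow();
    }
}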

Finally, let's summarize the execution flow of the JDK thread pool and of the Motan thread pool:
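The comment sketch below is a textual stand-in for the original article's two diagrams; the Motan side is read directly from the StandardThreadExecutor and ExecutorQueue code above.

// JDK ThreadPoolExecutor.execute(task):
//   1. poolSize < corePoolSize                    -> start a new worker thread
//   2. otherwise offer(task) to the work queue    -> always succeeds for an unbounded queue
//   3. offer fails (bounded queue is full)        -> start a thread, up to maximumPoolSize
//   4. at maximumPoolSize and the queue is full   -> reject via the RejectedExecutionHandler
//
// Motan StandardThreadExecutor.execute(task):
//   1. submittedTasksCount > queueCapacity + maxThreads -> reject immediately
//   2. ExecutorQueue.offer() returns false while poolSize < maximumPoolSize and there are
//      more submitted tasks than threads           -> a new thread is created first
//   3. once poolSize == maximumPoolSize            -> the task waits in the LinkedTransferQueue
//   4. if super.execute() still rejects because of a race around the queue -> force() the task
//      into the queue, and reject only if that also fails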

