Introduction to Koa service throttling methods and practices


I recently had a requirement. It sounded simple: start a server that, on each request, calls a provided interface and returns the result. Because of the interface's limited performance, the number of concurrent requests to it cannot exceed a certain limit, so the service needs throttling.

Throttling here means limiting the number of concurrent executions; requests beyond that limit are cached in a queue.

Koa middleware does not call next

The initial idea was to keep a count in the Koa middleware and, once there were more than six concurrent requests, cache the next function. When an ongoing task finished, the cached next would be called to let another request continue.

It turned out, however, that a Koa middleware that does not call the next function does not pause the request; it simply skips the later middleware and returns the response directly.

const Koa = require('koa');
const app = new Koa();

app.use((ctx, next) => {
  console.log('middleware 1');
  setTimeout(() => {
    next();
  }, 3000);
  ctx.body = 'hello';
});

app.use((ctx, next) => {
  console.log('middleware 2');
});

app.listen(8989);

The above code first prints 'middleware 1', then the browser receives 'hello', and three seconds later the console prints 'middleware 2'.

Note that even after a response has been sent, next can still be called and the later middleware will run (though modifications to ctx no longer take effect, because the response has already been returned). The same applies after the connection is closed.

Use await to wait for the request

Since delaying the next call will not work, the next idea is to use await to pause the current request: we await a function that returns a Promise, and store that Promise's resolve function in a queue so it can be called later.

const Koa = require('koa');
const app = new Koa();
const queue = [];

app.use(async (ctx, next) => {
  setTimeout(() => {
    queue.shift()();
  }, 3000);
  await delay();
  ctx.body = 'hello';
});

function delay() {
  return new Promise((resolve, reject) => {
    queue.push(resolve);
  });
}

app.listen(8989);

In the code above, the delay function creates a Promise, saves its resolve function to the queue, and returns the Promise. Three seconds later, the resolve function at the head of the queue is called, allowing the request to continue.
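The delay-and-queue mechanism can be seen in isolation, outside Koa. In this sketch (the fakeRequest name and the 50 ms release timer are made up for illustration), the caller suspends on the queued Promise until something shifts its resolve function out of the queue and calls it:

```javascript
const queue = [];

function delay() {
  return new Promise((resolve, reject) => {
    queue.push(resolve); // park the resolve function for a later call
  });
}

let finished = false;
async function fakeRequest() {
  await delay(); // suspends here, just like the Koa middleware does
  finished = true;
  return 'hello';
}

fakeRequest();
// At this point the request is parked: the queue holds one resolve
// function and `finished` is still false. Release it after 50 ms:
setTimeout(() => queue.shift()(), 50);
```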

Throttle by interface or by route?

Now that the basic throttling mechanism works, where should the throttling code live? There are basically two options:

Throttling for interfaces

In our case, throttling is needed because of the limited performance of the requested interface, so we can throttle calls to this interface separately:

async function requestSomeApi() {
  // If the maximum concurrency is exceeded, wait in the queue
  if (counter > maxAllowedRequest) {
    await delay();
  }
  counter++;
  const result = await request('http://some.api');
  counter--;
  if (queue.length) {
    queue.shift()();
  }
  return result;
}

Here is a version that is easier to reuse:

function limitWrapper(func, maxAllowedRequest) {
  const queue = [];
  let counter = 0; // must be `let`, since it is incremented and decremented
  return async function () {
    if (counter > maxAllowedRequest) {
      await new Promise((resolve, reject) => {
        queue.push(resolve);
      });
    }
    counter++;
    const result = await func();
    counter--;
    if (queue.length) {
      queue.shift()();
    }
    return result;
  };
}
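As a quick sanity check, the wrapper can be exercised with a hypothetical slow task that records its own peak concurrency (the task, the 20 ms delay, and the limit of 1 are made up for illustration; the wrapper is repeated so the snippet is self-contained). Note that because the check is `counter > maxAllowedRequest`, up to maxAllowedRequest + 1 calls actually run at once:

```javascript
// Self-contained sketch: the reusable wrapper plus a hypothetical slow task.
function limitWrapper(func, maxAllowedRequest) {
  const queue = [];
  let counter = 0;
  return async function () {
    if (counter > maxAllowedRequest) {
      await new Promise(resolve => queue.push(resolve));
    }
    counter++;
    const result = await func();
    counter--;
    if (queue.length) queue.shift()();
    return result;
  };
}

let running = 0;
let peak = 0;
function slowTask() {
  running++;
  peak = Math.max(peak, running); // track how many copies ran at once
  return new Promise(resolve =>
    setTimeout(() => { running--; resolve('done'); }, 20));
}

// With maxAllowedRequest = 1 the `counter > 1` check lets two calls through.
const limited = limitWrapper(slowTask, 1);

Promise.all([1, 2, 3, 4, 5, 6].map(() => limited())).then(results => {
  console.log('results:', results.length, 'peak concurrency:', peak);
});
```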

Throttling for routes

With this approach, we write a Koa middleware and do the throttling inside it:

async function limiter(ctx, next) {
  // If the maximum number of concurrent requests is exceeded, wait in the queue
  if (counter >= maxAllowedRequest) {
    await new Promise((resolve, reject) => {
      queue.push(resolve);
    });
  }
  counter++;
  await next();
  counter--;
  if (queue.length) {
    queue.shift()();
  }
}

Then we can use this middleware in the router for different routes:

router.use('/api', limiter);

Comparison

Throttling the interface left the logic a little messy, while throttling by route kept everything clean. All seemed perfect.

That is, until I received another requirement: call this interface three times and return an array of the three results. Now there is a problem. We cannot call the interface directly, because it needs throttling; nor can we call the request function directly, because our throttling is attached to the route. What can we do? We have the server request its own route...

Notes

Listen to the close event and remove the request from the queue
Requests waiting in the queue may be canceled by the user. As mentioned above, even when a request is canceled, the Koa middleware still runs, which means the throttled interface would still be called, wasting its limited capacity.

You can listen to the close event to handle this. Each request needs to carry a hash value so it can be found in the queue:

ctx.res.on('close', () => {
  const index = queue.findIndex(item => item.hash === hash);
  if (index > -1) {
    queue.splice(index, 1);
  }
});
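Note that this implies the queue entries are now objects carrying both the hash and the resolve function, rather than the bare resolve functions used earlier. A minimal sketch of that variant (the helper names and hash values are made up for illustration):

```javascript
const queue = [];

// Park a request under a hash so it can be found and removed on 'close'.
function delayWithHash(hash) {
  return new Promise((resolve, reject) => {
    queue.push({ hash, resolve });
  });
}

// Release the next waiting request, if any.
function releaseNext() {
  if (queue.length) queue.shift().resolve();
}

// Drop a canceled request from the queue by its hash.
function removeByHash(hash) {
  const index = queue.findIndex(item => item.hash === hash);
  if (index > -1) queue.splice(index, 1);
}

delayWithHash('req-1');
delayWithHash('req-2');
removeByHash('req-1'); // the canceled request no longer occupies a slot
```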

Set timeout

To prevent users from waiting too long, you also need to set a timeout, which is easy to do in Koa:

const server = app.listen(config.port);
server.timeout = DEFAULT_TIMEOUT;

The queue is too long

If the queue is already too long, a request will time out even after it has been added to the queue. Therefore, we also need to handle the case where the queue grows too long:

if (queue.length > maxAllowedRequest) {
  ctx.body = 'error message';
  return;
}

That is all the content of this article. I hope it is helpful for your learning.
