RateLimiter: Example Code for Interface Throttling with Guava
This article walks through interface throttling (rate limiting) with Guava's RateLimiter, with example code.
I. Problem Description
One day, Mr. A found that the request volume on one of his interfaces had suddenly risen to ten times its previous level. Before long the interface became almost unavailable and triggered a chain reaction that brought down the entire system. How can this problem be solved? Everyday life offers an answer: old-fashioned fuse boxes are fitted with fuses, and if someone plugs in an ultra-high-power appliance, the fuse blows to protect every other appliance from being damaged by the surge. Likewise, our interfaces need a "fuse" so that unexpected bursts of requests do not overload the system and bring it down. When traffic is too high, excess requests can be denied or redirected.
II. Common Traffic Limiting Algorithms
There are two common traffic limiting algorithms: the leaky bucket algorithm and the token bucket algorithm.
The idea of the leaky bucket algorithm is very simple: incoming requests are poured into a bucket that leaks water at a constant rate, and when the incoming requests exceed the bucket's capacity, the excess simply overflows and is discarded. The leaky bucket algorithm therefore effectively caps the data transmission rate.
Figure 1: Leaky bucket algorithm
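To make the idea concrete, here is a minimal, hypothetical leaky-bucket sketch in Java. It is not from the original article and not part of any library; the class name LeakyBucket and its fields are purely illustrative. Incoming requests add "water" to the bucket, the bucket drains at a fixed rate, and a request that would overflow the bucket is rejected.

public class LeakyBucket {
    private final long capacity;        // maximum amount of "water" the bucket can hold
    private final double leakRatePerMs; // how fast the bucket drains, units per millisecond
    private double water = 0;           // current water level
    private long lastLeakTs = System.currentTimeMillis();

    public LeakyBucket(long capacity, double leakRatePerSecond) {
        this.capacity = capacity;
        this.leakRatePerMs = leakRatePerSecond / 1000.0;
    }

    public synchronized boolean tryAccept() {
        long now = System.currentTimeMillis();
        // drain the water that has leaked out since the last call
        water = Math.max(0, water - (now - lastLeakTs) * leakRatePerMs);
        lastLeakTs = now;
        if (water + 1 > capacity) {
            return false; // the bucket is full: this request "overflows" and is rejected
        }
        water += 1;       // accept the request
        return true;
    }
}

With, say, a capacity of 100 and a leak rate of 10 per second, sustained traffic is capped at 10 requests per second, and once the bucket is full any further requests overflow.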
In many application scenarios, besides limiting the average transmission rate, it is also necessary to allow a certain degree of burst transmission. The leaky bucket algorithm is not well suited to this, but the token bucket algorithm is, as shown in Figure 2. The token bucket algorithm works by having the system add tokens to a bucket at a constant rate; a request must first take a token from the bucket before it can be processed, and if no token is available the request is denied. Because unused tokens accumulate up to the bucket's capacity, short bursts can still be served.
Figure 2: Token bucket algorithm
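For comparison, here is a minimal, hypothetical token-bucket sketch (again illustrative code, not from the article or any library). Tokens are added at a constant rate up to the bucket's capacity, so up to capacity requests can be served in a burst, after which throughput falls back to the refill rate; when no token is available the request is denied, exactly as described above.

public class TokenBucket {
    private final long capacity;      // maximum number of tokens the bucket can hold
    private final double tokensPerMs; // refill rate, tokens per millisecond
    private double tokens;            // tokens currently available
    private long lastRefillTs = System.currentTimeMillis();

    public TokenBucket(long capacity, double tokensPerSecond) {
        this.capacity = capacity;
        this.tokensPerMs = tokensPerSecond / 1000.0;
        this.tokens = capacity; // start full so an initial burst is allowed
    }

    public synchronized boolean tryAcquire() {
        long now = System.currentTimeMillis();
        // add the tokens that have accumulated since the last call, capped at capacity
        tokens = Math.min(capacity, tokens + (now - lastRefillTs) * tokensPerMs);
        lastRefillTs = now;
        if (tokens < 1) {
            return false; // no token available: deny the request
        }
        tokens -= 1;      // take a token and process the request
        return true;
    }
}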
III. Throttling Tool RateLimiter
Google's open-source Guava library provides the throttling utility class RateLimiter, which is based on the token bucket algorithm and is very convenient to use. For more details on the class's API, see the RateLimiter usage documentation.
RateLimiter Demo
package ratelimite;

import com.google.common.util.concurrent.RateLimiter;

public class RateLimiterDemo {
    public static void main(String[] args) {
        testNoRateLimiter();
        testWithRateLimiter();
    }

    public static void testNoRateLimiter() {
        long start = System.currentTimeMillis();
        for (int i = 0; i < 10; i++) {
            System.out.println("call execute.." + i);
        }
        long end = System.currentTimeMillis();
        System.out.println(end - start);
    }

    public static void testWithRateLimiter() {
        long start = System.currentTimeMillis();
        RateLimiter limiter = RateLimiter.create(10.0); // at most 10 permits are issued per second
        for (int i = 0; i < 10; i++) {
            limiter.acquire(); // request a permit from the RateLimiter; blocks if none is available yet
            System.out.println("call execute.." + i);
        }
        long end = System.currentTimeMillis();
        System.out.println(end - start);
    }
}
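The demo above uses acquire(), which blocks the calling thread until a permit becomes available. For the deny-or-redirect behaviour mentioned in the problem description, RateLimiter also offers tryAcquire(), which returns false immediately when no permit is free. Below is a minimal sketch of rejecting excess requests; the handler name, the rate of 10 permits per second, and the response strings are illustrative assumptions, not part of the original article.

import com.google.common.util.concurrent.RateLimiter;

public class RejectingDemo {
    // hypothetical limit: 10 permits per second for this interface
    private static final RateLimiter LIMITER = RateLimiter.create(10.0);

    // Illustrative handler: serve the request if a permit is available, otherwise reject it.
    public static String handleRequest(String request) {
        if (!LIMITER.tryAcquire()) {        // returns false immediately when no permit is free
            return "429 Too Many Requests"; // deny (or redirect) instead of blocking
        }
        return "processed " + request;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 20; i++) {
            System.out.println(handleRequest("req-" + i));
        }
    }
}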
IV. Guava Concurrency: ListenableFuture and RateLimiter Examples
Concept
ListenableFuture, as its name implies, is a Future that can be listened to; it is Guava's extension of the native Java Future. A Future represents an asynchronous computation whose result can be obtained once the task completes. If we want to display that result to the user or feed it into further computation as soon as it is ready, a plain Future forces us to poll its status from another thread, which makes the code complex and inefficient. With ListenableFuture, Guava watches for completion and automatically invokes a registered callback, reducing the complexity of concurrent code.
ListenableFuture supports two ways of registering such a callback: calling addListener() on the future directly, or using the Futures.addCallback() helper; both appear in the sample code below. The second method is recommended, because the callback receives the Future's return value directly and can also handle errors. In essence, the second method is implemented on top of the first; it simply encapsulates it further.
In addition, ListenableFuture has the following built-in implementations:
SettableFuture: a Future for which you do not implement the computation of the result; instead, you set the return value (or an exception) programmatically, and that fixed value becomes the Future's result. A short sketch follows this list.
CheckedFuture: an interface that extends ListenableFuture and adds a checkedGet() method, which throws a checked exception of a specified type when the underlying computation fails.
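A minimal SettableFuture sketch, assuming Guava is on the classpath (the demo class name and the printed strings are illustrative):

import com.google.common.util.concurrent.SettableFuture;

public class SettableFutureDemo {
    public static void main(String[] args) throws Exception {
        final SettableFuture<String> future = SettableFuture.create();

        // Another thread (or a callback) eventually supplies the result ...
        new Thread(new Runnable() {
            @Override
            public void run() {
                future.set("hello from another thread");
                // ... or reports a failure: future.setException(new RuntimeException("failed"));
            }
        }).start();

        System.out.println(future.get()); // blocks until set() or setException() is called
    }
}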
RateLimiter is superficially similar to the JDK's Semaphore, but whereas a Semaphore limits how many threads may access a resource concurrently, a RateLimiter limits the rate at which the resource is accessed, as the sketch below illustrates. The rest of this section shows RateLimiter and ListenableFuture used together.
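The following sketch contrasts the two; it is illustrative code, not from the original article, and the numbers are arbitrary.

import java.util.concurrent.Semaphore;

import com.google.common.util.concurrent.RateLimiter;

public class SemaphoreVsRateLimiter {
    public static void main(String[] args) throws InterruptedException {
        // Semaphore: at most 3 threads may hold a permit at the same time (a concurrency limit).
        Semaphore semaphore = new Semaphore(3);
        semaphore.acquire();
        try {
            // ... access the shared resource ...
        } finally {
            semaphore.release(); // permits are handed back by the caller, not replenished over time
        }

        // RateLimiter: permits are issued at a fixed rate of 3 per second (a rate limit).
        RateLimiter limiter = RateLimiter.create(3.0);
        limiter.acquire(); // permits are replenished by the limiter itself and are never "released"
    }
}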
Sample Code
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import com.google.common.util.concurrent.FutureCallback;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListeningExecutorService;
import com.google.common.util.concurrent.MoreExecutors;
import com.google.common.util.concurrent.RateLimiter;

public class ListenableFutureDemo {
    public static void main(String[] args) {
        testRateLimiter();
        testListenableFuture();
    }

    /**
     * RateLimiter limits the rate at which the tasks below are submitted.
     */
    public static void testRateLimiter() {
        ListeningExecutorService executorService =
                MoreExecutors.listeningDecorator(Executors.newCachedThreadPool());
        RateLimiter limiter = RateLimiter.create(5.0); // at most 5 tasks are submitted per second
        for (int i = 0; i < 10; i++) {
            limiter.acquire(); // request a permit from the RateLimiter; blocks if none is available yet
            final ListenableFuture<Integer> listenableFuture =
                    executorService.submit(new Task("is" + i));
        }
    }

    public static void testListenableFuture() {
        ListeningExecutorService executorService =
                MoreExecutors.listeningDecorator(Executors.newCachedThreadPool());
        final ListenableFuture<Integer> listenableFuture =
                executorService.submit(new Task("testListenableFuture"));

        // synchronously obtain the result of the call
        try {
            System.out.println(listenableFuture.get());
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        } catch (ExecutionException e1) {
            e1.printStackTrace();
        }

        // method 1: register a listener on the ListenableFuture directly
        listenableFuture.addListener(new Runnable() {
            @Override
            public void run() {
                try {
                    System.out.println("get listenable future's result " + listenableFuture.get());
                } catch (InterruptedException e) {
                    e.printStackTrace();
                } catch (ExecutionException e) {
                    e.printStackTrace();
                }
            }
        }, executorService);

        // method 2: register a callback via Futures.addCallback
        Futures.addCallback(listenableFuture, new FutureCallback<Integer>() {
            @Override
            public void onSuccess(Integer result) {
                System.out.println("get listenable future's result with callback " + result);
            }

            @Override
            public void onFailure(Throwable t) {
                t.printStackTrace();
            }
        });
    }
}

class Task implements Callable<Integer> {
    String str;

    public Task(String str) {
        this.str = str;
    }

    @Override
    public Integer call() throws Exception {
        System.out.println("call execute.." + str);
        TimeUnit.SECONDS.sleep(1);
        return 7;
    }
}
Guava version
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>14.0.1</version>
</dependency>
Summary
That is all for this article's example code on interface throttling with Guava's RateLimiter. I hope it is helpful to you. If you are interested, feel free to browse the other related topics on this site. If anything is missing, please leave a comment. Thank you for your support!