[Python] Caching function results in Redis


At work I ran into a problem: some operations require a full table scan of the database, and the results change only slowly, so it felt like there was room for a local optimization, such as temporarily caching the computed results.

The first thing that comes to mind is functools.lru_cache, but unfortunately that decorator does not exist in Python 2.7.
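For comparison, this is what the stdlib decorator looks like in Python 3, where it is available (a minimal sketch; the function name is made up for illustration):

```python
import functools

@functools.lru_cache(maxsize=255)
def expensive(n):
    # Stand-in for a slow computation, e.g. a query that scans a full table.
    return n * n

expensive(3)   # computed
expensive(3)   # served from the cache
print(expensive.cache_info().hits)  # 1
```

Everything below is about getting a comparable decorator (and later a Redis-backed one) under Python 2.7.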

So I found one on Stack Overflow:

(Source: https://stackoverflow.com/questions/11815873/memoization-library-for-python-2-7)

```python
import time
import functools
import collections


def lru_cache(maxsize=255, timeout=None):
    """lru_cache(maxsize = 255, timeout = None) --> returns a decorator which returns an instance (a descriptor).

    Purpose  - This decorator factory will wrap a function / instance method and will supply a caching
               mechanism to the function. For every given input params it will store the result in a queue
               of maxsize size, and will return a cached ret_val if the same parameters are passed.

    Params   - maxsize - int, the cache size limit; anything added above that will delete the first values
               entered (FIFO). This size is per instance, thus 1000 instances with maxsize of 255 would
               contain at max 255K elements.
             - timeout - int / float / None, every n seconds the cache is deleted, regardless of usage.
               If None - the cache will never be refreshed.

    Notes    - If an instance method is wrapped, each instance will have its own cache and its own timeout.
             - The wrapped function will have a cache_clear variable inserted into it, which may be called
               to clear its specific cache.
             - The wrapped function will maintain the original function's docstring and name (wraps).
             - The type of the wrapped function will no longer be that of a function, but either an instance
               of _lru_cache_class or a functools.partial type.

    On error - No error handling is done; in case an exception is raised, it will permeate up.
    """

    class _lru_cache_class(object):
        def __init__(self, input_func, max_size, timeout):
            self._input_func = input_func
            self._max_size = max_size
            self._timeout = timeout

            # This will store the cache for this function, format:
            # {caller1: [OrderedDict1, last_refresh_time1], caller2: [OrderedDict2, last_refresh_time2]}.
            # In case of an instance method the caller is the instance; when called from a regular
            # function the caller is None.
            self._caches_dict = {}

        def cache_clear(self, caller=None):
            # Reset the cache for the caller, only if it exists:
            if caller in self._caches_dict:
                del self._caches_dict[caller]
                self._caches_dict[caller] = [collections.OrderedDict(), time.time()]

        def __get__(self, obj, objtype):
            """Called for instance methods."""
            return_func = functools.partial(self._cache_wrapper, obj)
            return_func.cache_clear = functools.partial(self.cache_clear, obj)
            # Return the wrapped function, wrapped to maintain the docstring and name of the original:
            return functools.wraps(self._input_func)(return_func)

        def __call__(self, *args, **kwargs):
            """Called for regular functions."""
            return self._cache_wrapper(None, *args, **kwargs)

        # Set the cache_clear function on the __call__ operator:
        __call__.cache_clear = cache_clear

        def _cache_wrapper(self, caller, *args, **kwargs):
            # Create a unique key including the types (in order to differentiate between 1 and '1'):
            kwargs_key = "".join(map(lambda x: str(x) + str(type(kwargs[x])) + str(kwargs[x]),
                                     sorted(kwargs)))
            key = "".join(map(lambda x: str(type(x)) + str(x), args)) + kwargs_key

            # Check if the caller exists; if not, create one:
            if caller not in self._caches_dict:
                self._caches_dict[caller] = [collections.OrderedDict(), time.time()]
            else:
                # Invalidate in case the refresh time has passed:
                if self._timeout is not None:
                    if time.time() - self._caches_dict[caller][1] > self._timeout:
                        self.cache_clear(caller)

            # Check if the key exists; if so, return it:
            cur_caller_cache_dict = self._caches_dict[caller][0]
            if key in cur_caller_cache_dict:
                return cur_caller_cache_dict[key]

            # Validate we didn't exceed max_size:
            if len(cur_caller_cache_dict) >= self._max_size:
                # Delete the first (oldest) item in the dict:
                cur_caller_cache_dict.popitem(False)

            # Call the function and store the data in the cache
            # (call it with the caller in case it's an instance method - ternary condition):
            cur_caller_cache_dict[key] = (self._input_func(caller, *args, **kwargs)
                                          if caller is not None
                                          else self._input_func(*args, **kwargs))
            return cur_caller_cache_dict[key]

    # Return the decorator wrapping the class (also wraps the instance to maintain
    # the docstring and name of the original function):
    return lambda input_func: functools.wraps(input_func)(_lru_cache_class(input_func, maxsize, timeout))
```
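If the full version is hard to follow, the mechanism reduces to an OrderedDict with FIFO eviction plus a periodic full flush. A minimal sketch of just that core (my own condensation, plain functions only, no kwargs or instance-method support):

```python
import time
import functools
import collections


def simple_fifo_cache(maxsize=255, timeout=None):
    """Minimal FIFO cache with an optional timed full flush (plain functions only)."""
    def decorator(func):
        cache = collections.OrderedDict()
        last_refresh = [time.time()]  # mutable cell so the closure can update it

        @functools.wraps(func)
        def wrapper(*args):
            # Flush everything once the timeout has elapsed, regardless of usage:
            if timeout is not None and time.time() - last_refresh[0] > timeout:
                cache.clear()
                last_refresh[0] = time.time()
            # Type-aware key, so that 1 and '1' do not collide:
            key = "".join(str(type(a)) + str(a) for a in args)
            if key in cache:
                return cache[key]
            if len(cache) >= maxsize:
                cache.popitem(last=False)  # evict the oldest entry (FIFO)
            cache[key] = func(*args)
            return cache[key]
        return wrapper
    return decorator
```

Note that eviction is FIFO, not true LRU: a cache hit does not move the entry to the back of the queue, which matches the Stack Overflow version's behavior despite its name.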

However, there is a problem: in future deployments there will be multiple servers behind Nginx, and these cached results live in the memory of a single server, so different requests may see inconsistent results. What then?

Put it in Redis?

Then I thought of Flask-Cache, but unfortunately using it to cache the results of ordinary (non-view) function calls raises errors.

In the end I had to write one myself:

```python
import json
import redis
from functools import wraps

# cache_redis is assumed to be a redis.ConnectionPool configured elsewhere.


def cache_func_redis(timeout=100):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Build a key from the function name, the positional args and the
            # kwargs values sorted by keyword name (Python 2: dict.keys() returns a list):
            lst_dct = sorted([{k: kwargs[k]} for k in kwargs], key=lambda d: d.keys()[0])
            lst = [str(d.values()[0]) for d in lst_dct]
            k = ''.join([func.__name__, str(args), ''.join(lst)])
            r = redis.Redis(connection_pool=cache_redis)
            d = r.get(k)
            if d:
                # Cache hit - return the stored result:
                res = json.loads(d)['res']
                return res
            # Cache miss - compute, store with an expiry, and return:
            res = func(*args, **kwargs)
            d = json.dumps({
                'res': res
            })
            r.set(k, d)
            r.expire(k, timeout)
            return res
        return wrapper
    return decorator
```

The function name and the passed parameters are combined into a characteristic key, under which the computed result is stored in Redis with an expiration time of timeout seconds. A few caveats:
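The key construction can be checked in isolation. Here is a rough Python 3 restatement of those lines (make_cache_key is my own name; the original uses d.keys()[0], which is Python-2-only):

```python
def make_cache_key(func_name, args, kwargs):
    # Mirrors the decorator's key construction: function name + str() of the
    # positional-args tuple + stringified kwargs values, sorted by keyword name.
    vals = [str(v) for _, v in sorted(kwargs.items())]
    return ''.join([func_name, str(args), ''.join(vals)])


# Identical calls map to the same key:
k1 = make_cache_key('query', (1, 2), {'table': 'users'})
k2 = make_cache_key('query', (1, 2), {'table': 'users'})

# But a dict passed positionally is keyed via its str() form, whose ordering
# follows insertion order - so logically equal calls can produce different
# keys and miss the cache:
k3 = make_cache_key('query', ({'a': 1, 'b': 2},), {})
k4 = make_cache_key('query', ({'b': 2, 'a': 1},), {})
```

This is exactly the dictionary-parameter pitfall noted below.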

* If an argument passed in is a dictionary, the cache may not be hit (the key depends on the dict's str() form, so element ordering matters)

* It should only be used where freshness requirements are low, since results may be stale for up to timeout seconds

* The cached result should be a basic Python data structure (JSON-serializable), otherwise you may get an error

* Stress testing hasn't been done yet; results will be added here once it has
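The third caveat can be seen directly: json.dumps only accepts basic types, and even for those the round trip is not identity-preserving (tuples come back as lists):

```python
import json

# Basic structures survive the round trip, but not unchanged:
cached = json.loads(json.dumps({'res': (1, 2, 3)}))['res']
print(cached)  # a tuple comes back as a list: [1, 2, 3]

# Non-basic types cannot be stored at all:
try:
    json.dumps({'res': {1, 2, 3}})  # a set is not JSON-serializable
    failed = False
except TypeError:
    failed = True
```

So anything returning sets, datetimes, Decimals or custom objects either needs a different serializer (e.g. pickle) or should not go through this decorator.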

Resources:

https://github.com/python/cpython/blob/3.4/Lib/functools.py

https://stackoverflow.com/questions/11815873/memoization-library-for-python-2-7

