I ran into a problem at work: in a few places we need to do a full table scan of the database, and the requirements on the results are fairly loose, so it always felt like there was room to optimize, for example by temporarily storing the computed results.
The first thing that came to mind was functools.lru_cache, but unfortunately this decorator does not exist in Python 2.7.
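For comparison, this is roughly what the usage would look like with the standard decorator on Python 3.2+, or with the functools32 backport if installing an extra package on 2.7 were an option (the function below is just a hypothetical stand-in for the real query):

try:
    from functools import lru_cache      # Python 3.2+
except ImportError:
    from functools32 import lru_cache    # backport: pip install functools32

@lru_cache(maxsize=128)
def slow_scan(status):
    # hypothetical stand-in for the real full table scan
    return sum(1 for row in range(1000000) if row % 7 == status)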
So I grabbed one from Stack Overflow:
(source: https://stackoverflow.com/questions/11815873/memoization-library-for-python-2-7)
import time
import functools
import collections

def lru_cache(maxsize = 255, timeout = None):
    """lru_cache(maxsize = 255, timeout = None) --> returns a decorator which returns an instance (a descriptor).

        Purpose  - This decorator factory will wrap a function / instance method and will supply a caching mechanism to the function.
                   For every given input params it will store the result in a queue of maxsize size, and will return a cached ret_val
                   if the same parameters are passed.

        Params   - maxsize - int, the cache size limit, anything added above that will delete the first values enterred (FIFO).
                   This size is per instance, thus 1000 instances with maxsize of 255, will contain at max 255K elements.
                 - timeout - int / float / None, every n seconds the cache is deleted, regardless of usage. If None - cache will never be refreshed.

        Notes    - If an instance method is wrapped, each instance will have it's own cache and it's own timeout.
                 - The wrapped function will have a cache_clear variable inserted into it and may be called to clear it's specific cache.
                 - The wrapped function will maintain the original function's docstring and name (wraps)
                 - The type of the wrapped function will no longer be that of a function but either an instance of _LRU_Cache_class or a functool.partial type.

        On Error - No error handling is done, in case an exception is raised - it will permeate up.
    """

    class _LRU_Cache_class(object):
        def __init__(self, input_func, max_size, timeout):
            self._input_func = input_func
            self._max_size = max_size
            self._timeout = timeout

            # This will store the cache for this function, format - {caller1 : [OrderedDict1, last_refresh_time1], caller2 : [OrderedDict2, last_refresh_time2]}.
            # In case of an instance method - the caller is the instance, in case called from a regular function - the caller is None.
            self._caches_dict = {}

        def cache_clear(self, caller = None):
            # Remove the cache for the caller, only if exists:
            if caller in self._caches_dict:
                del self._caches_dict[caller]
                self._caches_dict[caller] = [collections.OrderedDict(), time.time()]

        def __get__(self, obj, objtype):
            """ Called for instance methods """
            return_func = functools.partial(self._cache_wrapper, obj)
            return_func.cache_clear = functools.partial(self.cache_clear, obj)
            # Return the wrapped function and wraps it to maintain the docstring and the name of the original function:
            return functools.wraps(self._input_func)(return_func)

        def __call__(self, *args, **kwargs):
            """ Called for regular functions """
            return self._cache_wrapper(None, *args, **kwargs)
        # Set the cache_clear function in the __call__ operator:
        __call__.cache_clear = cache_clear

        def _cache_wrapper(self, caller, *args, **kwargs):
            # Create a unique key including the types (in order to differentiate between 1 and '1'):
            kwargs_key = "".join(map(lambda x : str(x) + str(type(kwargs[x])) + str(kwargs[x]), sorted(kwargs)))
            key = "".join(map(lambda x : str(type(x)) + str(x), args)) + kwargs_key

            # Check if caller exists, if not create one:
            if caller not in self._caches_dict:
                self._caches_dict[caller] = [collections.OrderedDict(), time.time()]
            else:
                # Validate in case the refresh time has passed:
                if self._timeout != None:
                    if time.time() - self._caches_dict[caller][1] > self._timeout:
                        self.cache_clear(caller)

            # Check if the key exists, if so - return it:
            cur_caller_cache_dict = self._caches_dict[caller][0]
            if key in cur_caller_cache_dict:
                return cur_caller_cache_dict[key]

            # Validate we didn't exceed the max_size:
            if len(cur_caller_cache_dict) >= self._max_size:
                # Delete the first item in the dict:
                cur_caller_cache_dict.popitem(False)

            # Call the function and store the data in the cache (call it with the caller in case it's an instance function - Ternary condition):
            cur_caller_cache_dict[key] = self._input_func(caller, *args, **kwargs) if caller != None else self._input_func(*args, **kwargs)
            return cur_caller_cache_dict[key]

    # Return the decorator wrapping the class (also wraps the instance to maintain the docstring and the name of the original function):
    return (lambda input_func : functools.wraps(input_func)(_LRU_Cache_class(input_func, maxsize, timeout)))
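A minimal usage sketch of the decorator above (the function is a hypothetical stand-in; any callable works the same way):

@lru_cache(maxsize=128, timeout=60)
def slow_count(status):
    # hypothetical stand-in for the real full table scan
    return sum(1 for row in range(1000000) if row % 7 == status)

slow_count(3)             # first call computes and caches the result
slow_count(3)             # repeated call within 60 seconds is served from the cache
slow_count.cache_clear()  # drop this function's cache explicitly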
But this runs into a problem: the service will eventually be deployed as multiple servers behind nginx, while these cached results live in the memory of a single server, so different requests may see inconsistent results. What to do?
Put the cache into Redis?
Then I remembered Flask-Cache, but unfortunately using it to cache the result of an ordinary (non-view) function raised an error for me.
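For context, the kind of usage I was attempting looked roughly like this (a sketch only; the CACHE_* config values are placeholder assumptions, not the real deployment settings):

from flask import Flask
from flask_cache import Cache

app = Flask(__name__)
cache = Cache(app, config={'CACHE_TYPE': 'redis',
                           'CACHE_REDIS_URL': 'redis://localhost:6379/0'})

@cache.memoize(timeout=100)
def count_rows(status):
    # plain helper, not a Flask view -- this is the case that errored for me
    return sum(1 for row in range(1000000) if row % 7 == status)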
In the end, I had to write one myself:
from functools import wraps
import json

import redis

# cache_redis is the shared connection pool used by the app; the host/port/db
# values below are placeholders -- adjust them to the real deployment.
cache_redis = redis.ConnectionPool(host='localhost', port=6379, db=0)


def cache_func_redis(timeout=100):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Build the key from the function name, positional args and sorted kwargs
            # (Python 2: dict.keys()/values() return lists, so [0] indexing works).
            lst_dct = sorted([{k: kwargs[k]} for k in kwargs], key=lambda d: d.keys()[0])
            lst = [str(d.values()[0]) for d in lst_dct]
            k = ''.join([func.__name__, str(args), ''.join(lst)])
            r = redis.Redis(connection_pool=cache_redis)
            d = r.get(k)
            if d:
                res = json.loads(d)['res']
                return res
            # Cache miss: compute, store the result as JSON and set the expiry
            res = func(*args, **kwargs)
            d = json.dumps({
                'res': res
            })
            r.set(k, d)
            r.expire(k, timeout)
            return res
        return wrapper
    return decorator
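A quick usage sketch, assuming the decorator and the cache_redis pool above; the decorated function is a hypothetical stand-in for the real query:

@cache_func_redis(timeout=300)
def count_orders(status, shop_id=None):
    # hypothetical stand-in for the real full table scan
    return sum(1 for row in range(1000000) if row % 7 == status)

count_orders(3, shop_id=2)   # first call runs the query and writes the result to Redis
count_orders(3, shop_id=2)   # calls within 300 seconds are served from Redis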
It derives a key from the function name and the arguments passed in, and stores the computed result in Redis under that key with an expiry of timeout seconds. A few things to note:
* If one of the arguments is a dict, the cache may not be hit (see the sketch after this list)
* Only cache results in places where accuracy and freshness requirements are low
* The cached result should be a basic Python data structure, otherwise json.dumps may raise an error
* I have not done load testing yet; I will post the results once I have
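For the first caveat, one possible workaround (just a sketch, not what is deployed) is to build the key from a canonical JSON dump of the arguments, so dicts with the same content always produce the same key; it assumes the arguments are JSON-serialisable:

import hashlib
import json

def make_cache_key(func, args, kwargs):
    # sort_keys=True makes the dump deterministic regardless of dict key order
    payload = json.dumps({'fn': func.__name__, 'args': args, 'kwargs': kwargs},
                         sort_keys=True)
    # hash it to keep Redis key names short and free of special characters
    return 'cache:' + hashlib.md5(payload).hexdigest()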
References:
https://github.com/python/cpython/blob/3.4/Lib/functools.py
https://stackoverflow.com/questions/11815873/memoization-library-for-python-2-7