How to use Python decorators


This article takes a more in-depth look at Python decorators. Decorators are an important step in learning Python, and readers who need them can use this as a reference.

Decorators are used quite widely in Python. If you have used any of the Python web frameworks, then the `route()` decorator is no stranger to you. Let's look at a concrete case today.
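Before the case study, a minimal sketch of what a decorator is: a function that wraps another function and runs extra code around each call. The `log_call` name and the print message here are illustrative, not from any framework:

```python
import functools

def log_call(func):
    """Wrap func so that every call is announced before it runs."""
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        print('calling %s' % func.__name__)
        return func(*args, **kwargs)
    return wrapper

@log_call
def add(a, b):
    return a + b

print(add(1, 2))  # prints "calling add", then 3
```

The `@log_call` line is just shorthand for `add = log_call(add)`, which is exactly the mechanism the caching decorator below relies on.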

Let's simulate a scenario: you need to crawl a page, that page contains many sub-URLs to crawl separately, and inside those sub-pages there is further data to fetch. To keep things simple, we'll stop at three levels. Our code is as follows:


```python
def func_top(url):
    data_dict = {}
    # get the sub-URLs from the page
    sub_urls = xxxx
    data_list = []
    for it in sub_urls:
        data_list.append(func_sub(it))
    data_dict['data'] = data_list
    return data_dict

def func_sub(url):
    data_dict = {}
    # get the bottom-level URLs from the page
    bottom_urls = xxxx
    data_list = []
    for it in bottom_urls:
        data_list.append(func_bottom(it))
    data_dict['data'] = data_list
    return data_dict

def func_bottom(url):
    # fetch the data
    data = xxxx
    return data
```

func_top is the handler for the top-level page, func_sub is the handler for the sub-pages, and func_bottom is the handler for the deepest pages. After fetching the sub-page URLs, func_top calls func_sub for each of them, and func_sub does the same with func_bottom.
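To make the three-level structure concrete, here is a runnable sketch with stubbed page fetches standing in for the `xxxx` scraping logic elided in the original (the `fake_fetch_*` helpers and URL shapes are invented for the demo):

```python
def fake_fetch_sub_urls(url):
    # stand-in for scraping sub-URLs out of the top page
    return ['%s/sub%d' % (url, i) for i in range(2)]

def fake_fetch_bottom_urls(url):
    # stand-in for scraping bottom-level URLs out of a sub-page
    return ['%s/item%d' % (url, i) for i in range(2)]

def func_bottom(url):
    return 'data@' + url  # stand-in for the real data extraction

def func_sub(url):
    return {'data': [func_bottom(u) for u in fake_fetch_bottom_urls(url)]}

def func_top(url):
    return {'data': [func_sub(u) for u in fake_fetch_sub_urls(url)]}

result = func_top('http://example.com')
print(len(result['data']))  # 2 sub-pages, each holding 2 bottom-level items
```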

Under normal circumstances this already satisfies the requirement. But the site you want to crawl may be extremely unstable and frequently go down, so the data cannot always be fetched.

At this point you have two choices:

1. Stop on error, then restart from the position where it broke.

2. Keep running when errors occur, and on a later rerun skip the data you already have, pulling only what was not yet fetched.

The first option is basically impossible, because if the ordering of the other site's URLs changes, your recorded position becomes invalid. That leaves only the second option. Put plainly: cache the data you have already fetched, and when it is needed, take it straight from the cache.

OK, the goal is set. How do we achieve it?

If this were C++, it would be a very troublesome thing to do, and the resulting code would surely be ugly. Fortunately, we are using Python, and Python has decorators for functions.

So the implementation plan is:

Define a decorator: if the data has been fetched before, return it straight from the cache; if not, pull it from the site and put it into the cache.

The code is as follows:


```python
import os
import hashlib


def deco_args_recent_cache(category='dumps'):
    '''Decorator: return the most recently cached data for the same arguments'''
    def deco_recent_cache(func):
        def func_wrapper(*args, **kargs):
            sig = _mk_cache_sig(*args, **kargs)
            data = _get_recent_cache(category, func.__name__, sig)
            if data is not None:
                return data

            data = func(*args, **kargs)
            if data is not None:
                _set_recent_cache(category, func.__name__, sig, data)
            return data
        return func_wrapper
    return deco_recent_cache


def _mk_cache_sig(*args, **kargs):
    '''Generate a unique signature from the call arguments'''
    src_data = repr(args) + repr(kargs)
    m = hashlib.md5(src_data.encode('utf-8'))
    sig = m.hexdigest()
    return sig


def _get_recent_cache(category, func_name, sig):
    full_file_path = '%s/%s/%s' % (category, func_name, sig)
    if os.path.isfile(full_file_path):
        with open(full_file_path, 'r') as f:
            return eval(f.read())
    else:
        return None


def _set_recent_cache(category, func_name, sig, data):
    full_dir_path = '%s/%s' % (category, func_name)
    if not os.path.isdir(full_dir_path):
        os.makedirs(full_dir_path)

    full_file_path = '%s/%s/%s' % (category, func_name, sig)
    with open(full_file_path, 'w+') as f:
        f.write(repr(data))
```

Then we just need to add the deco_args_recent_cache decorator to each of func_top, func_sub, and func_bottom.
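For example, decorating func_bottom looks like this. The decorator is restated here in compact form so the sketch runs standalone, and the cache directory, demo URL, and stubbed fetch are all invented for the demo:

```python
import os
import hashlib
import tempfile

def deco_args_recent_cache(category='dumps'):
    """Compact restatement of the caching decorator, for a standalone demo."""
    def deco_recent_cache(func):
        def func_wrapper(*args, **kargs):
            sig = hashlib.md5((repr(args) + repr(kargs)).encode('utf-8')).hexdigest()
            path = os.path.join(category, func.__name__, sig)
            if os.path.isfile(path):
                with open(path) as f:
                    return eval(f.read())  # cache hit: skip the fetch entirely
            data = func(*args, **kargs)
            if data is not None:
                os.makedirs(os.path.dirname(path), exist_ok=True)
                with open(path, 'w') as f:
                    f.write(repr(data))
            return data
        return func_wrapper
    return deco_recent_cache

CACHE_DIR = tempfile.mkdtemp()  # hypothetical cache root, just for this demo
calls = {'bottom': 0}

@deco_args_recent_cache(category=CACHE_DIR)
def func_bottom(url):
    calls['bottom'] += 1          # count how often we really "hit the site"
    return 'data-from-%s' % url   # stand-in for the real fetch

print(func_bottom('http://example.com/1'))  # fetches and caches
print(func_bottom('http://example.com/1'))  # served from the cache
print(calls['bottom'])  # 1: the second call never touched the "site"
```

func_top and func_sub are decorated the same way, with no change to their bodies.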

Done! The biggest advantage of this approach is that every layer (top, sub, bottom) dumps its own data, so once a sub-layer's data has been dumped, none of its corresponding bottom-layer calls are made, which saves a lot of overhead!

OK, that's it. Life is short; I use Python!
