Protocol Adaptation with Python PEAK
If you are wrestling with metaclasses, suffering through asynchronous programming in Twisted, or exhausting yourself with multiple-dispatch object-oriented programming, help is at hand: PEAK combines all of these elements into a component programming framework. PEAK has its drawbacks, too. As with Twisted, much of the PEAK documentation, such as it is, is hard to follow. Still, this project, led by notable Python developer Phillip J. Eby, is worth watching; I think it offers a real chance to build highly productive applications.
The PEAK package consists of many sub-packages serving different purposes. Some of the important ones are peak.api, peak.binding, peak.config, peak.naming, and peak.storage. Most of those names are self-explanatory. The sub-package peak.binding provides flexible connections between components; peak.config lets you store "lazily immutable" data, which is relevant to declarative application programming; peak.naming lets you create globally unique identifiers for (network) resources; and peak.storage, as the name implies, manages databases and persistent content.
For this article, though, we will focus on peak.api. In particular, the PyProtocols package, which can be obtained separately, provides the infrastructure used by the other PEAK sub-packages. peak.api.protocols contains a version of the PyProtocols package, but for now I am interested in the standalone protocols package. In future installments I will return to other aspects of PEAK.
What is a protocol?
In the abstract, a protocol is just a collection of behaviors that an object agrees to honor. Strongly typed languages, Python among them, come with a set of basic types, and each basic type guarantees a set of behaviors: integers know how to compute their product; lists know how to iterate over their contents; dictionaries know how to look up the value for a key; files know how to read and write bytes; and so on. The set of behaviors you can rely on from a built-in type constitutes the protocol that type implements. A systematized protocol is called an interface.
For standard types, listing all the behaviors they implement is not too difficult (although versions of Python differ slightly, and of course other programming languages differ more). At the fuzzy boundaries, though -- for objects of custom classes -- it is harder to state what constitutes "dictionary-like" or "file-like" behavior. Most of the time, a custom object that implements only a subset of dict's built-in methods -- even a fairly small subset -- is "dictionary-like" enough for the purpose at hand. Still, it is attractive to be able to state explicitly what capabilities a function, module, class, or framework requires of an object. That is (part of) what the PyProtocols package does.
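The point about a small subset of dict's methods being "dictionary-like enough" can be seen in plain Python, without PyProtocols. In this sketch (the names describe and PairList are my own illustrations, not from PyProtocols), the consumer needs only an .items() method, so a class implementing just that one method serves as well as a real dict:

```python
# A hypothetical consumer that needs only one "dictionary-like" behavior
def describe(mapping):
    # Requires only that .items() yield (key, value) pairs
    return ", ".join("%s=%s" % (k, v) for k, v in mapping.items())

class PairList:
    """Implements a tiny subset of the dict protocol: just .items()."""
    def __init__(self, pairs):
        self.pairs = list(pairs)
    def items(self):
        return iter(self.pairs)

# A real dict and our minimal class are interchangeable here
print(describe({"a": 1}))                        # a=1
print(describe(PairList([("b", 2), ("c", 3)])))  # b=2, c=3
```

PyProtocols' contribution is to let such requirements be declared and checked explicitly, rather than discovered by trial and error.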
In programming languages with static type declarations, using data in a new context usually requires a cast or conversion from one type to another. In other languages, conversions are performed implicitly based on the needs of the context; these are called coercions. Python has both explicit conversions and coercions, and generally prefers the former ("explicit is better than implicit"). You can add a floating-point number to an integer and obtain the more general float, but if you want to turn the string "3.14" into a number, you need the explicit constructor float("3.14").
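A quick illustration of the two mechanisms in ordinary Python:

```python
# Implicit coercion: mixing int and float yields the more general float
total = 1 + 2.5
print(type(total).__name__)   # float

# Explicit conversion: strings never coerce to numbers automatically
value = float("3.14")
print(value)                  # 3.14

try:
    result = "3.14" + 1       # no implicit str -> number coercion
except TypeError:
    print("str + int raises TypeError")
```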
PyProtocols has a capability called adaptation, akin to the unorthodox computer-science notion of "partial typing". Adaptation might also be regarded as a generalized, souped-up form of coercion. If an interface defines a set of required capabilities (that is, object methods), then an object that wants to "do everything needed" requires adaptation -- performed by the protocols.adapt() function -- to provide those capabilities. Obviously, if you have an explicit conversion function that turns an object of type X into an object of type Y (where Y implements the interface IY), that function can adapt X to the IY protocol. But adaptation in PyProtocols can do much more than this. Even if you never explicitly wrote a conversion from type X to type Y, adapt() can often work out a way for X to provide the capabilities IY requires (say, by finding adapters from X to IZ, from IZ to IW, and then from IW to IY).
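The chaining behavior (finding a path X → IZ → IW → IY) can be sketched with a toy adapter registry. Everything below -- the registry dict, register(), and adapt_path() -- is my own illustrative invention, not the PyProtocols API, which accomplishes this through declareAdapter() and its kin:

```python
from collections import deque

# Toy adapter registry: maps (source, target) protocol names to functions
registry = {}

def register(src, dst, func):
    registry[(src, dst)] = func

def adapt_path(obj, src, dst):
    """Breadth-first search for a chain of adapters from src to dst."""
    queue = deque([(src, obj)])
    seen = {src}
    while queue:
        proto, value = queue.popleft()
        if proto == dst:
            return value
        for (a, b), func in registry.items():
            if a == proto and b not in seen:
                seen.add(b)
                queue.append((b, func(value)))
    raise TypeError("no adaptation path from %s to %s" % (src, dst))

# No direct list -> ILisp adapter is registered, only a chain of two
register("ISeq", "IMap", lambda seq: dict(enumerate(seq)))
register("IMap", "ILisp",
         lambda m: "(MAP %s)" % " ".join("(%s %s)" % kv for kv in m.items()))

print(adapt_path(["a", "b"], "ISeq", "ILisp"))  # (MAP (0 a) (1 b))
```

The real library is far more sophisticated (it handles declared inheritance between protocols, adapter precedence, and so on), but the path-finding intuition is the same.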
Declaring interfaces and adapters
PyProtocols offers many different ways to create interfaces and adapters. The PyProtocols documentation describes these techniques in great detail -- more of them than this article can cover. We will go into some of the details below, but first I think it useful to present a compact example of real PyProtocols code.
Suppose, for example, that I decide to create a Lisp-like serialization of Python objects. I do not claim the result is precise Lisp syntax, nor do I much care about the exact merits of this format. The idea is to build a function that does a job similar to the repr() function or the pprint module, but whose results differ clearly from those earlier serializers and, more importantly, are easier to extend and customize. One choice made here is quite un-Lisp-like: mappings are treated as a more basic structure than sequences (lists and tuples), and a sequence is handled as a mapping whose keys are consecutive integers. The code is as follows:
lispy.py: PyProtocols definitions
from protocols import *
from cStringIO import StringIO

# Like unicode, & even support objects that don't explicitly support ILisp
ILisp = protocolForType(unicode, ['__repr__'], implicit=True)

# Class for interface, but no methods specifically required
class ISeq(Interface):
    pass

# Class for interface, extremely simple mapping interface
class IMap(Interface):
    def items():
        "A requirement for a map is to have an .items() method"

# Define function to create a Lisp-like representation of a mapping
def map2Lisp(map_, prot):
    out = StringIO()
    for k, v in map_.items():
        out.write("(%s %s) " % (adapt(k, prot), adapt(v, prot)))
    return "(MAP %s)" % out.getvalue()

# Use this func to convert an IMap-supporting obj to ILisp-supporting obj
declareAdapter(map2Lisp, provides=[ILisp], forProtocols=[IMap])

# Note that a dict implements an IMap interface with no conversion needed
declareAdapter(NO_ADAPTER_NEEDED, provides=[IMap], forTypes=[dict])

# Define and use func to adapt an InstanceType obj to the ILisp interface
from types import InstanceType
def inst2Lisp(o, p):
    return "(CLASS '(%s) %s)" % (o.__class__.__name__, adapt(o.__dict__, p))
declareAdapter(inst2Lisp, provides=[ILisp], forTypes=[InstanceType])

# Define a class to adapt an ISeq-supporting obj to an IMap-supporting obj
class SeqAsMap(object):
    advise(instancesProvide=[IMap],
           asAdapterForProtocols=[ISeq])
    def __init__(self, seq, prot):
        self.seq = seq
        self.prot = prot
    def items(self):
        # Implement the IMap-required .items() method
        return enumerate(self.seq)

# Note that list, tuple implement an ISeq interface w/o conversion needed
declareAdapter(NO_ADAPTER_NEEDED, provides=[ISeq], forTypes=[list, tuple])

# Define a lambda func to adapt str, unicode to ILisp interface
declareAdapter(lambda s, p: "'(%s)" % s,
               provides=[ILisp], forTypes=[str, unicode])

# Define a class to adapt several numeric types to ILisp interface
# Return a string (ILisp-supporting) directly from instance constructor
class NumberAsLisp(object):
    advise(instancesProvide=[ILisp],
           asAdapterForTypes=[long, float, complex, bool])
    def __new__(klass, val, proto):
        return "(%s %s)" % (val.__class__.__name__.upper(), val)
In the code above, I declared adapters in several different styles. In some cases the code adapts one interface to another interface; in others, a type itself is adapted directly to an interface. A few aspects of the code are worth noticing: (1) no adapter is created from list or tuple to the ILisp interface; (2) no adapter is explicitly declared for the int numeric type; (3) likewise, no adapter is declared from dict directly to ILisp. The following shows how the code nonetheless adapts all of these Python objects:
test_lispy.py: object serialization
from lispy import *
from sys import stdout, stderr

toLisp = lambda o: adapt(o, ILisp)

class Foo:
    def __init__(self):
        self.a, self.b, self.c = 'a', 'b', 'c'

tests = [ "foo bar",
          {17:2, 33:4, 'biz':'baz'},
          ["bar", ('f','o','o')],
          1.23,
          (1L, 2, 3, 4+4j),
          Foo(),
          True ]

for test in tests:
    stdout.write(toLisp(test) + '\n')
When run, we get:
test_lispy.py serialization results
$ python2.3 test_lispy.py
'(foo bar)
(MAP (17 2) ('(biz) '(baz)) (33 4) )
(MAP (0 '(bar)) (1 (MAP (0 '(f)) (1 '(o)) (2 '(o)) )) )
(FLOAT 1.23)
(MAP (0 (LONG 1)) (1 2) (2 3) (3 (COMPLEX (4+4j))) )
(CLASS '(Foo) (MAP ('(a) '(a)) ('(c) '(c)) ('(b) '(b)) ))
(BOOL True)
It helps to walk through this output. The first line is relatively simple: we defined an adapter directly from strings to ILisp, so the call adapt("foo bar", ILisp) just returns the result of the lambda function. The next line is subtler. There is no adapter directly from dict to ILisp; but dict adapts to IMap with no adapter needed (we declared as much), and we have an adapter from IMap to ILisp. Similarly, for the lists and tuples that follow, a list or tuple adapts to ISeq, ISeq adapts to IMap, and IMap adapts to ILisp. PyProtocols works out the adaptation path to take, and all of this remarkable machinery runs behind the scenes. An old-style instance, like a string or an IMap-supporting object, has a direct adaptation to ILisp (which in turn adapts the instance's __dict__ as a mapping).
But wait: how were all the integers inside our dict and tuple objects handled? Numbers of the long, complex, float, and bool types have explicit adapters declared, but int has none. The trick here is that an int object already has a .__repr__() method, and .__repr__() is all that the ILisp interface (implicitly) requires. So, as a built-in type, integers are rendered as plain digits, without the capitalized type-name prefix (such as LONG) that the adapted numeric types receive.
The adaptation protocol
Let's look more closely at what the protocols.adapt() function does. In the example above, we used the declaration APIs to implicitly install a set of adaptation "factories". These APIs come in several layers. The "primitive" declaration API consists of the functions declareAdapterForType(), declareAdapterForObject(), and declareAdapterForProtocol(). The example used some of the higher-level APIs instead, such as declareImplementation(), declareAdapter(), adviseObject(), and protocolForType(). In one case, we saw the "magical" advise() declaration inside a class body; advise() supports a long list of keyword arguments configuring what is advised about the class and its role. You can also advise() a module object.
You do not have to use the declarative APIs, though, to create objects or interfaces that know how to adapt. Let's look at the call signature of adapt() and the process that follows from it. A call to adapt() looks like:
The call signature of adapt()
adapt(component, protocol [, default [, factory]])
This says that you would like the object component adapted to the interface protocol. Adaptation may return a wrapper object or a modified component; if default is specified, it is returned as a fallback. If factory is given as a keyword argument, that conversion factory is used to produce the wrapper or modification. In any case, let's look at the full sequence of actions adapt() attempts (in simplified code):
A hypothetical implementation of adapt()
if isinstance(component, protocol):
    return component
elif hasattr(component, '__conform__'):
    return component.__conform__(protocol)
elif hasattr(protocol, '__adapt__'):
    return protocol.__adapt__(component)
elif default is not None:
    return default
elif factory is not None:
    return factory(component, protocol)
else:
    raise NotImplementedError
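The simplified sequence above closely follows the adapt() protocol proposed in PEP 246, which PyProtocols builds upon. As a runnable sketch of how the two hooks cooperate, here is a minimal adapt() in modern Python. It treats a "protocol" as simply a class, and the Celsius example and _marker sentinel are my own illustrations, not part of PyProtocols:

```python
_marker = object()  # sentinel so that None is a usable default

def adapt(component, protocol, default=_marker, factory=None):
    # Already an instance of the protocol: no work needed
    if isinstance(component, protocol):
        return component
    # Ask the object itself whether it can conform to the protocol
    conform = getattr(component, "__conform__", None)
    if conform is not None:
        result = conform(protocol)
        if result is not None:
            return result
    # Ask the protocol whether it knows how to adapt the object
    adapt_hook = getattr(protocol, "__adapt__", None)
    if adapt_hook is not None:
        result = adapt_hook(component)
        if result is not None:
            return result
    if default is not _marker:
        return default
    if factory is not None:
        return factory(component, protocol)
    raise NotImplementedError("cannot adapt %r to %s" % (component, protocol))

class Celsius:
    def __init__(self, degrees):
        self.degrees = degrees
    def __conform__(self, protocol):
        # Object-side hook: knows how to present itself as a float
        if protocol is float:
            return float(self.degrees)

print(adapt(Celsius(21.5), float))                # 21.5
print(adapt("not a number", float, default=0.0))  # 0.0
```

Note that the real PyProtocols adapt() also consults its declaration registries, which is how the declared adapters in lispy.py come into play.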
Calls to adapt() should maintain certain properties, though these are advice to the programmer rather than anything the library enforces. Adaptation should be idempotent: for an object x and a protocol P, we would like adapt(x, P) == adapt(adapt(x, P), P). The intent is broadly similar to that of an iterator class returning self from its .__iter__() method: re-adapting to a protocol you have already adapted to should not produce a different result.
It is also worth noting that adaptation can be lossy. In adapting an object to an interface, it may be inconvenient or impossible to keep all the information the original object carried. That is, in general, for an object x and protocols P1 and P2: adapt(x, P1) != adapt(adapt(adapt(x, P1), P2), P1).
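Lossiness is easy to demonstrate without any framework. In this sketch, the two converter functions are hypothetical stand-ins for adapters (dict_to_keys playing "adapt to a key-list protocol", keys_to_dict playing the reverse); the values dropped by the first cannot be recovered by the second:

```python
def dict_to_keys(d):
    # "Adapt" a mapping to a sorted key list: the values are dropped
    return sorted(d)

def keys_to_dict(keys):
    # "Adapt" back to a mapping: the lost values cannot be recovered
    return {k: None for k in keys}

x = {"a": 1, "b": 2}
round_trip = keys_to_dict(dict_to_keys(x))
print(round_trip)       # {'a': None, 'b': None}
print(round_trip == x)  # False: adaptation through the key list was lossy
```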
Before closing, let's look at one more test script, this time using the lower-level behavior of adapt():
test_lispy2.py: object serialization
from lispy import *

class Bar(object):
    pass

class Baz(Bar):
    def __repr__(self):
        return "Represent a " + self.__class__.__name__ + " object!"

class Bat(Baz):
    def __conform__(self, prot):
        return "Adapt " + self.__class__.__name__ + " to " + repr(prot) + "!"

print adapt(Bar(), ILisp)
print adapt(Baz(), ILisp)
print adapt(Bat(), ILisp)
print adapt(adapt(Bat(), ILisp), ILisp)

$ python2.3 test_lispy2.py
<__main__.Bar object at 0x65250>
Represent a Baz object!
Adapt Bat to WeakSubset(<type 'unicode'>,('__repr__',))!
'(Adapt Bat to WeakSubset(<type 'unicode'>,('__repr__',))!)
The last two lines of output show that the design of lispy.py falls short of the idempotence goal; improving that design would make a good exercise. A representation like ILisp's will, of course, discard some information from the original object -- and that is fine.
Conclusion
PyProtocols has something in common with other "exotic" topics covered in this column. For one, the declaration APIs are declarative (as opposed to imperative). Declarative programming does not spell out the steps and switches needed to perform an action; it declares what should be handled, and the library or compiler works out how to carry it out. The names "declare*()" and "advise*()" come from this point of view.
I also find that PyProtocols programming resembles programming with multiple dispatch, specifically with the gnosis.magic.multimethods module I discussed in an earlier installment. Where PyProtocols computes adaptation paths, my module performs a comparatively simple resolution over the relevant ancestor classes to decide what to dispatch to. Both libraries, though, encourage a similarly modular style of programming: many small functions or classes performing "pluggable" tasks, without rigid class hierarchies. In my opinion, this style has its advantages.