This article introduces how to get started with metaprogramming in Python. It is adapted from technical documentation on the IBM developerWorks site.
Review object-oriented programming
Let's take thirty seconds to review what OOP is. In object-oriented programming languages, you can define classes, whose purpose is to bind together related data and behavior. These classes can inherit some or all of the attributes (data) and methods (behavior) of their parents, but they can also define attributes and methods of their own. When the process of defining a class ends, the class usually acts as a template for creating instances (sometimes called objects). Different instances of the same class typically hold different data, but they share the same "shape": for example, the Employee objects bob and jane both have a .salary and a .room_number, but the two salaries and room numbers differ.
Some OOP languages, including Python, allow objects to be introspective (this is also called reflection). That is, an introspective object can describe itself: which class does this instance belong to? What are the class's ancestors? What methods and attributes are available on the object? Introspection lets a function or method that handles objects make decisions based on the type of object passed in. Even without introspection, functions frequently branch on instance data; for example, the route to jane.room_number differs from the route to bob.room_number because they are in different rooms. With introspection, we can also safely compute jane's bonus while skipping bob's, for example because jane has a .profit_share attribute, or because bob is an instance of the subclass Hourly(Employee).
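In Python 3 syntax, that kind of introspective branching might be sketched as follows (Employee, Hourly, and the bonus rule here are hypothetical illustrations, not from any real library):

```python
class Employee:
    def __init__(self, salary, room_number):
        self.salary = salary
        self.room_number = room_number

class Hourly(Employee):
    pass

def bonus(worker):
    # Introspect the instance: hourly workers get no bonus, and only
    # employees that actually carry a profit_share attribute get one.
    if isinstance(worker, Hourly):
        return 0
    if hasattr(worker, 'profit_share'):
        return worker.salary * worker.profit_share
    return 0

jane = Employee(90000, 101)
jane.profit_share = 0.05   # jane participates in profit sharing
bob = Hourly(50000, 202)   # bob is paid hourly
```

Here `bonus(jane)` yields 4500.0 while `bonus(bob)` yields 0, purely from inspecting the objects' types and attributes.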
Metaprogramming
The basic OOP system described above is quite powerful. However, one element of that description is easy to overlook: in Python (and other languages), classes are themselves objects that can be passed around and introspected. Since, as noted above, classes serve as templates for producing objects, what serves as a template for producing classes? The answer is a metaclass.
Python has always had metaclasses, but the machinery involved became much better exposed in Python 2.2. Specifically, Python 2.2 no longer uses only one special (and mostly hidden) metaclass to create every class object. Programmers can now subclass the original metaclass, type, and can even dynamically generate classes using a variety of metaclasses. Of course, just because you can manipulate metaclasses in Python 2.2 does not by itself explain why you might want to.
Moreover, you do not need custom metaclasses to manipulate the production of classes. A less daunting concept is the class factory: an ordinary function that returns a class created dynamically in the function body. In traditional Python syntax, you can write:
Listing 1. vintage Python 1.5.2 class factory
Python 1.5.2 (#0, Jun 27 1999, 11:23:01) [...]
Copyright 1991-1995 Stichting Mathematisch Centrum, Amsterdam
>>> def class_with_method(func):
...     class klass: pass
...     setattr(klass, func.__name__, func)
...     return klass
...
>>> def say_foo(self): print 'foo'
...
>>> Foo = class_with_method(say_foo)
>>> foo = Foo()
>>> foo.say_foo()
foo
The factory function class_with_method() dynamically creates a class containing the method/function passed to the factory, and returns that class. The class itself is manipulated in the function body before being returned. The new module offers a terser way to do the same thing, though without the option of running custom code in the body of the class factory. For example:
Listing 2. class factory in the new module
>>> from new import classobj
>>> Foo2 = classobj('Foo2',(Foo,),{'bar':lambda self:'bar'})
>>> Foo2().bar()
'bar'
>>> Foo2().say_foo()
foo
In all these cases, the behavior of the classes (Foo and Foo2) is not written out directly as code; instead, it is created by calling a function at runtime, with dynamic arguments. It is worth emphasizing that it is not only the instances that are created dynamically, but the classes themselves.
Metaclasses: a solution in search of a problem?
Metaclasses are deeper magic than 99% of users should ever worry about. If you wonder whether you need them, you don't (the people who actually need them know with certainty that they need them, and don't need an explanation about why). -- Python expert Tim Peters
A (class) method can return an object, just as an ordinary function can; in this sense, it is obvious that class factories can be classes just as easily as they can be functions. In particular, Python 2.2+ provides a special class called type, which is exactly such a class factory. Of course, readers will recognize type() as the less ambitious built-in function of older Python versions; fortunately, the behavior of the old type() function is preserved by the type class (in other words, type(obj) returns the type/class of the object obj). As a class factory, the new type class works the same way that new.classobj does:
Listing 3. type as a class-factory metaclass
>>> X = type('X',(),{'foo':lambda self:'foo'})
>>> X, X().foo()
(<class '__main__.X'>, 'foo')
However, because type is now a (meta)class, you are free to subclass it:
Listing 4. A type descendant as class factory
>>> class ChattyType(type):
...     def __new__(cls, name, bases, dct):
...         print "Allocating memory for class", name
...         return type.__new__(cls, name, bases, dct)
...     def __init__(cls, name, bases, dct):
...         print "Init'ing (configuring) class", name
...         super(ChattyType, cls).__init__(name, bases, dct)
...
>>> X = ChattyType('X',(),{'foo':lambda self:'foo'})
Allocating memory for class X
Init'ing (configuring) class X
>>> X, X().foo()
(<class '__main__.X'>, 'foo')
The magic .__new__() and .__init__() methods are special, but conceptually they work the same way they do for any other class. The .__init__() method lets you configure the created object; the .__new__() method lets you customize its allocation. The latter, of course, is not widely used, but it exists for every Python 2.2 new-style class (it is usually inherited rather than overridden).
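Since printed output is awkward to verify, here is a Python 3 sketch of the same division of labor between the two hooks (the names Recording and made_by are my own illustrations, not standard API):

```python
class Recording(type):
    created = []  # shared log of every class this metaclass produces

    def __new__(mcls, name, bases, dct):
        # __new__ runs first and allocates the new class object;
        # we take the chance to inject an attribute into its dict.
        dct['made_by'] = mcls.__name__
        return super().__new__(mcls, name, bases, dct)

    def __init__(cls, name, bases, dct):
        # __init__ then configures the already-allocated class.
        Recording.created.append(name)
        super().__init__(name, bases, dct)

# Calling the metaclass directly, exactly like type() in Listing 3
X = Recording('X', (), {'foo': lambda self: 'foo'})
```

After this runs, `X.made_by` is `'Recording'` and `Recording.created` is `['X']`, showing that both hooks fired during class creation.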
There is one feature of type descendants to pay attention to; it often trips up people using metaclasses for the first time. By convention, the first argument of these methods is named cls rather than self, because the methods operate on the class being produced, not on the metaclass. Actually, there is nothing special about this; all methods attach to their instances, and the instances of a metaclass are classes. A non-special name makes this more obvious:
Listing 5. attaching class methods to the generated class
>>> class Printable(type):
...     def whoami(cls): print "I am a", cls.__name__
...
>>> Foo = Printable('Foo',(),{})
>>> Foo.whoami()
I am a Foo
>>> Printable.whoami()
Traceback (most recent call last):
TypeError: unbound method whoami() [...]
All of this surprising-but-ordinary magic, together with easy-to-understand syntax, makes metaclasses simpler to use, though it also confuses new users. The syntax has several other elements, but the resolution order of the new variants takes some care. A class can inherit its metaclass from an ancestor; note that this is not the same thing as having the metaclass as an ancestor (this is another frequent point of confusion). For old-style classes, defining a global __metaclass__ variable can force a custom metaclass to be used. But most of the time, the safest approach is to set a __metaclass__ class attribute in the class you want created with a custom metaclass. You must set the variable in the class definition itself, because the metaclass is not used if you set the attribute later (after the class object has already been created). For example:
Listing 6. Setting the metaclass with a class attribute
>>> class Bar:
...     __metaclass__ = Printable
...     def foomethod(self): print 'foo'
...
>>> Bar.whoami()
I am a Bar
>>> Bar().foomethod()
foo
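A note for modern readers: the __metaclass__ attribute works only in Python 2. In Python 3, the metaclass is named in the class header instead. A sketch of the equivalent follows (whoami here returns rather than prints, so the result is easy to check; otherwise the names match Listings 5 and 6):

```python
class Printable(type):
    def whoami(cls):
        return "I am a " + cls.__name__

# Python 3 spelling: the metaclass is given as a keyword in the
# class header, not as a __metaclass__ attribute in the body.
class Bar(metaclass=Printable):
    def foomethod(self):
        return 'foo'
```

As in the Python 2 version, whoami() is visible on the class Bar but not on its instances, because it lives on the metaclass.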
Using this "magic" to solve problems
So far we have covered metaclass basics. Putting metaclasses to use, however, is subtler. The challenge in using metaclasses is that in typical OOP design, classes do not really do much. The inheritance structure of classes is useful for encapsulating and organizing data and methods, but it is the instances that are used in concrete cases.
We think metaclasses are useful for two broad types of programming task.
The first (and probably more common) type occurs when you cannot know at design time exactly what a class needs to do. Obviously you know something about it, but some particular detail may depend on information that becomes available only later. "Later" itself comes in two flavors: (a) when a library module is used by an application; (b) at runtime, when some condition holds. This type is closely akin to what is called "aspect-oriented programming" (AOP). We will show an example we think is quite elegant:
Listing 7. Runtime metaclass configuration
% cat dump.py
#!/usr/bin/python
import sys
if len(sys.argv) > 2:
    module, metaklass = sys.argv[1:3]
    m = __import__(module, globals(), locals(), [metaklass])
    __metaclass__ = getattr(m, metaklass)
class Data:
    def __init__(self):
        self.num = 38
        self.lst = ['a','b','c']
        self.str = 'spam'
    dumps = lambda self: `self`
    __str__ = lambda self: self.dumps()
data = Data()
print data
% dump.py
<__main__.Data instance at 1686a0>
As you would expect, the application prints a fairly generic description of the data object (an ordinary instance object). However, if runtime arguments are passed to the application, we can get a rather different result:
Listing 8. Adding an external serialization metaclass
% dump.py gnosis.magic MetaXMLPickler
<?xml version="1.0"?>
[... the rest of the XML pickle of the Data instance is omitted here]
This particular example uses the serialization style of gnosis.xml.pickle, but the most recent gnosis.magic package also contains MetaYamlDump, MetaPyPickler, and MetaPrettyPrint. Moreover, a user of the dump.py "application" can use a "MetaPickler" from any Python package that defines one with the expected interface. Writing an appropriate metaclass for this purpose looks like:
Listing 9. Adding attributes with a metaclass
class MetaPickler(type):
    "Metaclass for gnosis.xml.pickle serialization"
    def __init__(cls, name, bases, dict):
        from gnosis.xml.pickle import dumps
        super(MetaPickler, cls).__init__(name, bases, dict)
        setattr(cls, 'dumps', dumps)
The remarkable thing about this arrangement is that the application programmer need not know which serialization will be used, nor even whether serialization or some other cross-cutting capability will be added at the command line.
Perhaps the most common use of metaclasses is similar to MetaPickler's: adding, deleting, renaming, or substituting methods in the classes being produced. In our example, a "native" Data.dumps() method is replaced by a different one from outside the application, at the time the class Data is created (and therefore in every instance created afterward).
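Since gnosis is an old package, here is a minimal stand-in for the same pattern using only the standard library. MetaJsonPickler is my own illustrative name, not part of gnosis; it grafts a dumps() method, backed by json, onto every class it creates:

```python
import json

def _dumps(self):
    # Serialize the instance's attribute dictionary deterministically
    return json.dumps(self.__dict__, sort_keys=True)

class MetaJsonPickler(type):
    "Metaclass that grafts a dumps() method onto every class it creates"
    def __init__(cls, name, bases, dct):
        super().__init__(name, bases, dct)
        setattr(cls, 'dumps', _dumps)

# Python 3 spelling of Listing 7's Data class, minus the list attribute
class Data(metaclass=MetaJsonPickler):
    def __init__(self):
        self.num = 38
        self.str = 'spam'
```

Calling `Data().dumps()` now returns a JSON rendering of the instance, even though the class body never defined a dumps() method.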
The second type of task arises in programming environments where classes are often more important than instances. For example, declarative mini-languages are Python libraries whose program logic is expressed directly in class declarations. David examined them in his article "Create declarative mini-languages." In such cases, using metaclasses to influence the process of class creation is quite powerful.
One class-based declarative framework is gnosis.xml.validity. In this framework, you declare a number of "validity classes" that represent a set of constraints on valid XML documents. These declarations closely mirror the ones contained in a DTD. For example, a "dissertation" document type can be configured with the following code:
Listing 10. gnosis.xml.validity rules in simple_diss.py
from gnosis.xml.validity import *
class figure(EMPTY):      pass
class _mixedpara(Or):     _disjoins = (PCDATA, figure)
class paragraph(Some):    _type = _mixedpara
class title(PCDATA):      pass
class _paras(Some):       _type = paragraph
class chapter(Seq):       _order = (title, _paras)
class dissertation(Some): _type = chapter
If you try to instantiate the dissertation class without the correct component children, a descriptive exception is raised; the same holds for each child element. The correct children are also generated from simpler arguments when there is only one unambiguous way to "lift" the arguments to the correct type.
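As a toy illustration of this declarative style (entirely my own construction, far simpler than gnosis.xml.validity, and in Python 3 syntax), a metaclass can record every declared element class in a registry purely as a side effect of the class statements themselves:

```python
class ElementMeta(type):
    "Collects every declared element class into a registry"
    registry = {}

    def __init__(cls, name, bases, dct):
        super().__init__(name, bases, dct)
        if bases:  # skip the abstract root class itself
            ElementMeta.registry[name] = cls

class Element(metaclass=ElementMeta):
    pass

# The declarations alone do the work; nothing is ever instantiated.
class title(Element):
    pass

class chapter(Element):
    children = (title,)
```

After these class statements execute, the framework already "knows" every declared element, which is the kind of information a DTD generator would consume.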
Even though validity classes are often (informally) based on a pre-existing DTD, instances of these classes simply print themselves as XML document fragments, for example:
Listing 11. creating basic validity documents
>>> from simple_diss import *
>>> ch = LiftSeq(chapter, ('It Starts','When it began'))
>>> print ch
<chapter><title>It Starts</title>
<paragraph>When it began</paragraph>
</chapter>
By using a metaclass to create the validity classes, we can generate a DTD out of the class declarations (and, while we are at it, add an extra method to those validity classes):
Listing 12. Using a metaclass during module import
>>> from gnosis.magic import DTDGenerator, \
...                          import_with_metaclass, \
...                          from_import
>>> d = import_with_metaclass('simple_diss', DTDGenerator)
>>> from_import(d,'**')
>>> ch = LiftSeq(chapter, ('It Starts','When it began'))
>>> print ch.with_internal_subset()
<?xml version='1.0'?>
<!DOCTYPE chapter [
 ... (ELEMENT declarations elided) ...
]>
<chapter><title>It Starts</title>
<paragraph>When it began</paragraph>
</chapter>
The package gnosis.xml.validity knows nothing about DTDs and internal subsets; those concepts and capabilities are introduced entirely by the metaclass DTDGenerator, without any change to gnosis.xml.validity or simple_diss.py. DTDGenerator does not substitute its own .__str__() method into the classes it produces (you can still print the plain XML fragment), but a metaclass could easily modify such "magic" methods as well.
Meta conveniences
The gnosis.magic package contains several utilities for working with metaclasses, as well as sample metaclasses usable in aspect-oriented programming. The most important of these utilities is import_with_metaclass(). This function, used in the examples above, lets you import a third-party module, but with all of the module's classes created using a custom metaclass rather than type. Whatever new capability you want to give that third-party module can be defined in the metaclass you create (or obtain from somewhere else). gnosis.magic contains some pluggable serialization metaclasses; some other package might provide tracing, object persistence, exception logging, or other capabilities.
The import_with_metaclass() function illustrates several facets of metaclass programming:
Listing 13. import_with_metaclass() from gnosis.magic
import inspect

def import_with_metaclass(modname, metaklass):
    "Module importer substituting custom metaclass"
    class Meta(object):
        __metaclass__ = metaklass
    dct = {'__module__':modname}
    mod = __import__(modname)
    for key, val in mod.__dict__.items():
        if inspect.isclass(val):
            setattr(mod, key, type(key,(val,Meta),dct))
    return mod
The interesting thing about this function is that it generates an ordinary class Meta using the specified metaclass. But once Meta is added as an ancestor, its descendants are also produced using the custom metaclass. In principle, a class like Meta could carry both a metaclass producer and a set of inheritable methods; these two aspects of the class are orthogonal.
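For readers on Python 3, where the __metaclass__ trick in Listing 13 no longer works, a rough sketch of the same idea is to rebuild each of the module's classes directly with the custom metaclass (this is my own approximation, not the gnosis implementation):

```python
import inspect
import sys
import types

def import_with_metaclass(modname, metaklass):
    "Import a module, re-deriving each of its plain classes under metaklass"
    mod = __import__(modname)
    for key, val in list(vars(mod).items()):
        # Rebuild only plain classes; each original class becomes a base
        # of its replacement, so existing behavior is preserved.
        if inspect.isclass(val) and type(val) is type:
            setattr(mod, key, metaklass(key, (val,), {'__module__': modname}))
    return mod

# Demo with a synthetic module, since no third-party module is at hand
fake = types.ModuleType('fakemod')
fake.Widget = type('Widget', (), {'size': 3})
sys.modules['fakemod'] = fake

class Tagged(type):
    def whoami(cls):
        return cls.__name__

mod = import_with_metaclass('fakemod', Tagged)
```

After the call, mod.Widget keeps its original attributes but is now an instance of Tagged, so metaclass methods such as whoami() become available on it.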