Typically, when we define a global variable (and yes, before the objections start: global variables are bad), we expect to be able to read it from inside a function, and Python understands that:
bar = 42
def foo():
    print bar
Here we read the global variable bar inside the foo function, and it works as expected.
So far so good. Once we discover this feature, we tend to use it all over our code, and in an example like the following it still works correctly:
bar = [42]
def foo():
    bar.append(0)

>>> foo()
>>> print bar
[42, 0]
But what happens if we assign to bar instead:
>>> bar = 42
>>> def foo():
...     bar = 0
...
>>> foo()
>>> print bar
42
We can see that the foo function runs fine and raises no exception, yet when we print the value of bar afterwards it is still 42. The reason is the line bar = 0: it does not change the value of the global variable bar, but instead creates a local variable named bar whose value is 0. This is a very hard bug to find, and it is painful for novices who do not yet understand Python's scoping rules. To understand how Python handles local and global variables, let's look at a rarer, but perhaps even more confusing, error, where we print bar's value and then define a local variable called bar:
bar = 42
def foo():
    print bar
    bar = 0
Nothing wrong with that, right? We only define the same-named variable after printing the value, so the print should not be affected (Python is, after all, an interpreted language). Or is it?
It turns out there is a mistake: calling foo() raises an exception.
How is that possible? Well, there are two misconceptions at work here. The first is about Python, as an interpreted language (very cool, we all agree on that), executing code line by line. In fact, Python executes statement by statement. To get a feel for what I mean, open your favorite shell and enter the following:

>>> def foo():
Now press ENTER. As you can see, the shell produces no output; it waits for you to continue the definition of the function, and it will keep waiting until you finish defining it. That is because a function definition is a single statement. Well, a compound statement that contains other statements, but a statement nonetheless. The body of the function is not executed until the function is called; what executing the def statement really does is create a function object.
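You can check this point directly. The minimal sketch below (using the print() call form so it runs on both Python 2 and 3) would blow up immediately if the body were executed at definition time; instead, the def statement silently creates a function object with a compiled code object inside:

```python
# Executing a def statement does not run the body; it only compiles the
# body and binds the name to a freshly created function object.
def foo():
    raise RuntimeError("the body ran!")  # would fire if the body executed now

# No exception so far: def only created a function object...
print(type(foo).__name__)         # the name 'foo' is bound to a function
# ...whose compiled body is stored as a code object.
print(type(foo.__code__).__name__)
```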
Which brings us to the second misconception. Again, the dynamic, interpreted nature of Python leads us to believe that when print bar executes, Python looks for a variable called bar in the local scope first and then falls back to the global scope. What actually happens is that the local scope is not fully dynamic. When the def statement is processed, Python statically collects information about the function's local scope. When it reaches bar = 0 (not when that line of code executes, but when the interpreter reads it), it adds 'bar' to the list of local variables of the foo function. When foo executes and Python is about to run the print bar line, it looks the variable up in the local scope; because that decision was made statically, Python knows the variable has not been assigned yet, the name has no value, and it raises an exception.
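You can watch the compiler make this static decision. This is a minimal sketch using the standard dis module; the exact opcode names vary a little across Python versions:

```python
import dis

bar = 42

def foo():
    print(bar)
    bar = 0

# 'bar' was recorded as a local variable of foo at compile time:
print(foo.__code__.co_varnames)  # → ('bar',)

# The disassembly shows the print line reading 'bar' with a *local* lookup
# (LOAD_FAST, or LOAD_FAST_CHECK on recent versions), not LOAD_GLOBAL.
dis.dis(foo)
```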
You might ask: why not raise this exception when the function is defined? Python could know beforehand that the variable is referenced before being assigned. The answer is that Python has no way of knowing whether this local variable bar will actually be referenced before assignment. Take a look at the following example:
bar = 42
def foo(baz):
    if baz > 0:
        print bar
    bar = 0
Python plays a subtle game between the dynamic and the static. The only thing it knows for sure is that bar is assigned somewhere in the body; it does not know whether bar is referenced before that assignment until it actually happens. And, to be honest, it does not even know whether the variable will be assigned at all!
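We can check that the error really does depend on which path runs at call time. A minimal sketch of the example above, using print() calls:

```python
bar = 42

def foo(baz):
    if baz > 0:
        print(bar)  # reads the *local* bar, which is assigned only below
    bar = 0         # this assignment makes 'bar' local for the whole body

# The print line never runs, so nothing goes wrong:
foo(-1)

# Now the local 'bar' is read before it is assigned:
try:
    foo(1)
except UnboundLocalError as exc:
    print("raised:", exc)
```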
bar = 42
def foo():
    print bar
    if False:
        bar = 0
>>> foo()
Traceback (most recent call last):
  File "<pyshell#17>", line 1, in <module>
    foo()
  File "<pyshell#16>", line 3, in foo
    print bar
UnboundLocalError: local variable 'bar' referenced before assignment
Looking at the code above, we, as intelligent creatures, can see perfectly well that bar is never assigned. Python ignores that fact and still registers bar as a local variable.
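Even the dead branch is enough, and we can verify this directly. A minimal sketch, inspecting the function's compiled code object:

```python
bar = 42

def foo():
    print(bar)
    if False:
        bar = 0  # never executes, yet still makes 'bar' local to foo

# The compiler registered 'bar' as a local variable of foo anyway:
print(foo.__code__.co_varnames)  # → ('bar',)

# ...so calling foo() fails despite the global bar = 42:
try:
    foo()
except UnboundLocalError as exc:
    print(exc)
```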
I have said enough about the problem. What we need is a solution, and I will give you two.
>>> bar = 42
>>> def foo():
...     global bar
...     print bar
...     bar = 0
...
>>> foo()
42
>>> bar
0
The first is to use the global keyword. Its name is self-explanatory: it tells Python that bar is the global variable, not a local one.
The second method, and the more recommended one, is simply not to use global variables. In all my years of Python development I have never needed the global keyword. It is fine to know how to use it, but in the end, try to avoid it. If you want a value to be shared across your code from beginning to end, define it as an attribute of a class. That way you do not need global at all: whenever you want the value, you access it through the class attribute:
>>> class Baz(object):
...     bar = 42
...
>>> def foo():
...     print Baz.bar  # global
...     bar = 0        # local
...     Baz.bar = 8    # global
...     print bar
...
>>> foo()
42
0
>>> print Baz.bar
8