Synchronous Process
import subprocess

cmd = ('tail', '/tmp/test.log')
sp = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
if sp.wait() == 0:
    print('command executed successfully.')
else:
    print(sp.stderr.read())
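On Python 3, this synchronous pattern is usually written with subprocess.run, which waits for the command and captures its output in one call. A minimal sketch, using a small Python one-liner as a stand-in for tail so the example is self-contained and portable:

```python
import subprocess
import sys

# subprocess.run (Python 3.5+) waits for the command to finish and,
# with capture_output=True, collects stdout and stderr for us.
# sys.executable runs a child Python as a stand-in for tail.
cmd = (sys.executable, '-c', "print('last line of log')")
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode == 0:
    print('command executed successfully.')
else:
    print(result.stderr)
```

The returned CompletedProcess bundles the exit code and both output streams, so no separate wait() call is needed.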
Asynchronous Process
import subprocess

cmd = ('tail', '-f', '/tmp/test.log')
sp = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
while True:
    if sp.poll() is not None:
        print('command completed.')
        break
    print(sp.stdout.readline())
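The same polling loop can be exercised end to end against a short-lived child. This sketch substitutes a small Python one-liner for tail -f so the loop terminates on its own:

```python
import subprocess
import sys

# A short-lived stand-in for `tail -f`: the child prints two lines and exits.
cmd = (sys.executable, '-c', "print('line1'); print('line2')")
sp = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
lines = []
while True:
    line = sp.stdout.readline()
    if line:
        lines.append(line.rstrip())   # got a line; keep reading
    elif sp.poll() is not None:
        break                         # EOF and the child has exited
print('command completed.')
```

Note that readline() blocks until a line arrives, so checking it before poll() ensures no buffered output is lost when the child exits.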
Process Communication
sp = subprocess.Popen("dir", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(stdoutput, erroutput) = sp.communicate()
sp.communicate() waits until the process exits and returns its standard output and standard error output; this is how the output of a sub-process can be obtained. Above, standard output and standard error are kept separate; you can also set the stderr parameter to subprocess.STDOUT to merge them:
sp = subprocess.Popen("dir", shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
(stdoutput, erroutput) = sp.communicate()
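With stderr=subprocess.STDOUT, the second element returned by communicate() is None, and everything, errors included, arrives on stdout. A self-contained sketch, with a Python one-liner standing in for dir:

```python
import subprocess
import sys

# The child writes to both streams; stderr is redirected into stdout.
cmd = (sys.executable, '-c',
       "import sys; print('normal output'); print('error output', file=sys.stderr)")
sp = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT, text=True)
(stdoutput, erroutput) = sp.communicate()
```

After this, stdoutput contains both lines and erroutput is None, since no separate stderr pipe was opened.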
If you want to process the output of a sub-process line by line, that works too:
sp = subprocess.Popen("dir", shell=True, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT, text=True)
while True:
    buff = sp.stdout.readline()
    if buff == '' and sp.poll() is not None:
        break
    print(buff)
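Since readline() returns an empty string only at end of file, the loop above can also be written by iterating over sp.stdout directly, which reads until EOF without the explicit poll() check (again with a stand-in child command in place of dir):

```python
import subprocess
import sys

cmd = (sys.executable, '-c', "print('row 1'); print('row 2')")
sp = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT, text=True)
# Iterating over a pipe yields one line at a time until EOF.
rows = [line.rstrip() for line in sp.stdout]
sp.wait()
```

The final wait() reaps the child so its exit code is available in sp.returncode.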
Deadlock
If you use a pipe but do not consume its output, be careful: if the sub-process produces too much data, a deadlock occurs. For example:
sp = subprocess.Popen("longprint", shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
sp.wait()
longprint is a hypothetical program with a large amount of output. In my XP and Python 2.5 environment, the deadlock occurred once the output reached 4096 bytes: the OS pipe buffer fills up, the child blocks writing to it, and sp.wait() blocks waiting for the child, so neither can make progress. If we drain the output with sp.stdout.readline() or sp.communicate(), no deadlock occurs no matter how much output is produced. Alternatively, we can avoid the pipe entirely, for example by redirecting the output to a file.
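The file-redirection workaround can be sketched as follows: send the child's output to a temporary file instead of a pipe, so sp.wait() cannot block on a full pipe buffer however much the child writes. The chatty child below is a stand-in for the hypothetical longprint:

```python
import subprocess
import sys
import tempfile

# Stand-in for longprint: writes well past a 4096-byte pipe buffer.
cmd = (sys.executable, '-c', "for _ in range(1000): print('x' * 80)")
with tempfile.TemporaryFile() as out:
    # Output goes straight to the file, so no pipe can fill up.
    sp = subprocess.Popen(cmd, stdout=out, stderr=subprocess.STDOUT)
    rc = sp.wait()              # safe: nothing for the child to block on
    out.seek(0)
    output_size = len(out.read())
```

This trades streaming access for safety: the output is only available once the child has finished, but wait() can never deadlock.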