How can I call an external command (as if I'd typed it at the Unix shell or Windows command prompt) from within a Python script?
Look at the subprocess module in the stdlib:
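For example (subprocess.run is the Python 3.5+ interface; on older versions subprocess.call works similarly):

```python
import subprocess

# Run a command given as an argument list -- no shell is involved
result = subprocess.run(["echo", "hello"],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(result.returncode)       # 0 on success
print(result.stdout.decode())  # the command's standard output
```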
The advantage of subprocess over os.system is that it is more flexible: you can get stdout, stderr, the "real" status code, better error handling, and so on.
Here's a summary of the ways to call external programs and the advantages and disadvantages of each:
Finally, please be aware that for all methods where you pass the final command to be executed by the shell as a string, you are responsible for escaping it. There are serious security implications if any part of the string that you pass cannot be fully trusted (for example, if a user is entering some or all of it). If you are unsure, only use these methods with constants. To give you a hint of the implications, consider this code:
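The snippet itself was lost in formatting; a minimal sketch of the vulnerable pattern (filename is a hypothetical user-supplied value, and echo stands in for a harmless command):

```python
import os

# Imagine `filename` came straight from user input:
filename = "notes.txt; echo INJECTED"
# The whole string is handed to the shell, so the ';' starts a second command
output = os.popen("echo " + filename).read()
print(output)
```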
and imagine that the user enters "my mama didnt love me && rm -rf /".
I typically use:
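The snippet was stripped here; it was presumably a Popen pattern along these lines (a sketch, not the author's exact code):

```python
import subprocess

p = subprocess.Popen("ls", shell=True,
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in p.stdout:          # stream the output line by line
    print(line.decode(), end="")
retval = p.wait()              # the command's exit status
```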
You are free to do what you want with the captured output.
Some hints on detaching the child process from the calling one (starting the child process in the background). Suppose you want to start a long-running task from a CGI script; that is, the child process should live longer than the CGI script's own process. The classical example from the subprocess module docs is:
The idea here is that you do not want to wait on the 'call subprocess' line until longtask.py has finished. But it is not clear what happens after the 'some more code here' line of the example. My target platform was FreeBSD, but the development was on Windows, so I faced the problem on Windows first. On Windows (XP), the parent process will not finish until longtask.py has finished its work, which is not what you want in a CGI script. The problem is not specific to Python; in the PHP community the problems are the same. The solution is to pass the DETACHED_PROCESS flag to the underlying CreateProcess function in the Win32 API. If you happen to have pywin32 installed, you can import the flag from the win32process module; otherwise you should define it yourself:
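Defining the flag yourself might look like this (0x00000008 is the documented value of DETACHED_PROCESS; longtask.py is the script from the example, and the Popen call is only meaningful on Windows):

```python
import subprocess
import sys

DETACHED_PROCESS = 0x00000008  # Win32 CreateProcess creation flag

if sys.platform == "win32":
    # Start longtask.py fully detached from the parent's console
    pid = subprocess.Popen([sys.executable, "longtask.py"],
                           creationflags=DETACHED_PROCESS).pid
```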
Update 2015-10-27: @eryksun notes in a comment below that the semantically correct flag is CREATE_NEW_CONSOLE (0x00000010). On FreeBSD we have another problem: when the parent process finishes, it finishes the child processes as well, and that is not what you want in a CGI script either. Some experiments showed that the problem seemed to be in sharing sys.stdout. And the working solution was the following:
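The working snippet was stripped; the idea was to stop sharing the parent's stdout/stderr with the child. A sketch, with a trivial child standing in for longtask.py (preexec_fn=os.setpgrp is a POSIX-only detail added here to detach the process group; it is an assumption, since the answer only mentions the sys.stdout sharing):

```python
import os
import subprocess
import sys

with open(os.devnull, "w") as devnull:
    p = subprocess.Popen([sys.executable, "-c", "print('working')"],
                         stdout=devnull,          # do not share sys.stdout
                         stderr=devnull,
                         preexec_fn=os.setpgrp)   # own process group (POSIX)
p.wait()  # demonstration only; a CGI script would return without waiting
```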
I have not checked the code on other platforms and do not know the reasons for the behaviour on FreeBSD. If anyone knows, please share your ideas. Googling for starting background processes in Python does not shed any light yet.
I'd recommend using the subprocess module instead of os.system: when you pass the command and its arguments as a list, no shell is involved and no escaping is needed, which makes it much safer: http://docs.python.org/library/subprocess.html
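For instance, passing the command as a list keeps the shell out of the picture entirely:

```python
import subprocess

# Each argument is passed literally -- the ';' here is just text, not shell syntax
rc = subprocess.call(["echo", "hello; rm -rf /"])
```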
If you want to return the results of the command, you can use os.popen.
Note that this is dangerous, since the command isn't sanitized. I leave it up to you to google for the relevant docs on the 'os' and 'sys' modules. There are a bunch of functions (exec* and spawn*) that will do similar things.
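As an example of this older os-module style (os.popen is deprecated in favor of subprocess, but it illustrates the point):

```python
import os

# os.popen returns a file-like object connected to the command's stdout
output = os.popen("echo hello").read()
print(output)  # 'hello\n'
```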
Check out the "pexpect" Python library, too. It allows interactive control of external programs/commands, even ssh, ftp, telnet, etc. You can just type something like:
I always use
But this seems to be a good tool. Look at an example:
If what you need is the output from the command you are calling, you can use subprocess.check_output (Python 2.7+).
Also note the shell parameter.
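For example (check_output raises CalledProcessError on a non-zero exit status; pass shell=True only if you really need shell features, and only with trusted strings):

```python
import subprocess

out = subprocess.check_output(["echo", "hello"])
print(out)  # b'hello\n'

# Same thing through the shell -- only safe with trusted, constant strings:
out_sh = subprocess.check_output("echo hello", shell=True)
```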
This is how I run my commands; this code has pretty much everything you need:
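The original snippet was lost; a sketch of a run() call that covers return code, stdout, and stderr in one shot (capture_output and text require Python 3.7+):

```python
import subprocess

result = subprocess.run(
    ["echo", "hello"],
    capture_output=True,  # collect stdout and stderr
    text=True,            # decode bytes to str
    check=False,          # set True to raise on a non-zero exit status
)
print(result.returncode, result.stdout, result.stderr)
```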
With the standard library, use the subprocess module:
It is the recommended standard way. However, more complicated tasks (pipes, output, input, etc.) can be tedious to construct and write. Note: shlex.split can help you parse the command for run, call, and other subprocess functions in case you don't want to (or can't) provide them in the form of lists.
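As an illustration of why pipes are tedious to spell out by hand with subprocess, here is the equivalent of the shell pipeline echo hello | tr a-z A-Z:

```python
import subprocess

p1 = subprocess.Popen(["echo", "hello"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["tr", "a-z", "A-Z"],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()        # allow p1 to receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
print(out)  # b'HELLO\n'
```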
If you do not mind external dependencies, use plumbum:
It is arguably the best option. Another popular library is sh:
However, note that sh does not support Windows.
Without the output of the result:
With the output of the result:
Update:
Here are some examples from the docs. Run a process:
Raise on failed run:
Capture output:
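Concretely, the three docs examples look roughly like this (adapted here to harmless commands; true and false are standard POSIX utilities):

```python
import subprocess

# Run a process:
subprocess.run(["true"])

# Raise on failed run:
try:
    subprocess.run(["false"], check=True)
except subprocess.CalledProcessError as e:
    failed_status = e.returncode  # non-zero exit status of `false`

# Capture output:
cp = subprocess.run(["echo", "hi"], stdout=subprocess.PIPE)
print(cp.stdout)  # b'hi\n'
```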
Original answer: I recommend trying Envoy. It's a wrapper for subprocess, which in turn aims to replace the older modules and functions. Envoy is subprocess for humans. Example usage from the readme:
Pipe stuff around too:
There is also Plumbum
Get more information here.
See https://docs.python.org/2/library/subprocess.html, or for a very simple command:
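The "very simple command" case was presumably os.system, which hands the string to the shell and returns the exit status:

```python
import os

status = os.system("echo hello")  # prints hello; returns 0 on success
```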
There are lots of different libraries which allow you to call external commands with Python. For each library I've given a description and shown an example of calling an external command.
Hopefully this will help you make a decision on which library to use :)
Subprocess allows you to call external commands and connect them to their input/output/error pipes (stdin, stdout and stderr). Subprocess is the default choice for running commands, but sometimes other modules are better.
os is used for "operating system dependent functionality". It can also be used to call external commands, with os.system.
sh is a subprocess interface which lets you call programs as if they were functions. This is useful if you want to run a command multiple times.
plumbum is a library for "script-like" python programs. You can call programs like functions as in sh. Plumbum is useful if you want to run a pipeline without the shell.
pexpect lets you spawn child applications, control them, and find patterns in their output. This is a better alternative to subprocess for commands that expect a tty on Unix.
fabric is a Python 2.5 and 2.7 library; it allows you to execute local and remote shell commands. Fabric is a simple alternative for running commands in a secure shell (SSH).
envoy is known as "subprocess for humans"; it is used as a convenience wrapper around the subprocess module.
commands contains wrapper functions for os.popen, but it has been removed from Python 3, since subprocess is a better alternative. [Edit based on J.F. Sebastian's comment.]
There is another difference here which is not mentioned above.
I tried launching kwrite via subprocess: execution was successful, but I could not communicate with the launched program, even though everything worked normally when I ran both from the terminal. One more note: kwrite behaves differently from other apps; if you try the below with Firefox, the results will not be the same. Does anyone run kwrite so that it is not a subprocess (i.e., in the system monitor it must appear at the leftmost edge of the tree)?
os - This module provides a portable way of using operating-system-dependent functionality. For more os functions, see the documentation.
I tend to use subprocess together with shlex (to handle escaping of quoted strings):
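For example, shlex.split turns a quoted command line into the list subprocess expects:

```python
import shlex
import subprocess

command_line = 'echo "two words"'
args = shlex.split(command_line)
print(args)  # ['echo', 'two words'] -- the quoted part stays one argument
rc = subprocess.call(args)
```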
Shameless plug: I wrote a library for this :P https://github.com/houqp/shell.py It's basically a wrapper for popen and shlex for now. It also supports piping commands, so you can chain commands more easily in Python and do things like:
You can use Popen, and then you can check the subprocess's status:
Check out subprocess.Popen.
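A sketch of checking the status via Popen.poll (None means the process is still running):

```python
import subprocess
import sys
import time

# A short-lived child process standing in for a real command
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(0.2)"])
while proc.poll() is None:  # still running
    time.sleep(0.05)
print("exit status:", proc.returncode)
```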
Here are my two cents: in my view, this is best practice when dealing with external commands. These are the return values from the execute method:
And this is the execute method:
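The method body itself was stripped; a minimal sketch of such an execute helper, returning the (return code, stdout, stderr) triple described above:

```python
import shlex
import subprocess

def execute(command):
    """Run `command`; return (returncode, stdout, stderr)."""
    proc = subprocess.Popen(shlex.split(command),
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out, err = proc.communicate()
    return proc.returncode, out.decode(), err.decode()

rc, out, err = execute("echo hello")
print(rc, out.strip())
```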
To fetch the network ID from OpenStack Neutron:
Output of nova net-list
Output of print(networkId)
Just to add to the discussion: if you use a Python console, you can call external commands from IPython. At the IPython prompt, you can call shell commands by prefixing them with '!'. You can also combine Python code with the shell and assign the output of shell commands to Python variables. For instance:
There are a lot of different ways to run external commands in Python, and all of them have their own plus sides and drawbacks.

My colleagues and I have been writing Python sysadmin tools, so we need to run a lot of external commands, and sometimes you want them to block or run asynchronously, time out, update every second, and so on. There are also different ways of handling the return code and errors, and you might want to parse the output and provide new input (in an expect kind of style). Or you will need to redirect stdin, stdout, and stderr to run in a different tty (e.g., when using screen).

So you will probably have to write a lot of wrappers around the external command. Here is a Python module we have written which can handle almost anything you would want, and if not, it's very flexible, so you can easily extend it: https://github.com/hpcugent/vsc-base/blob/master/lib/vsc/utils/run.py
Use for line in proc.stdout: (or for line in iter(proc.stdout.readline, '') in Python 2) instead of for line in proc.stdout.readlines():. See Python: read streaming input from subprocess.communicate(). – J.F. Sebastian Jun 12 '15 at 18:41