
I want to write a function that will execute a shell command and return its output as a string, whether it is an error or a success message. I just want to get the same result that I would have gotten with the command line.

What would be a code example that would do such a thing?

For example:

def run_command(cmd):
    # ??????

print run_command('mysqladmin create test -uroot -pmysqladmin12')
# Should output something like:
# mysqladmin: CREATE DATABASE failed; error: 'Can't create database 'test'; database exists'
related: stackoverflow.com/questions/2924310/… – J.F. Sebastian Jan 24 '11 at 9:22

11 Answers 11

Accepted answer:

For convenience, Python 2.7 provides the

subprocess.check_output(*popenargs, **kwargs)  

function, which takes the same arguments as Popen, but returns a string containing the program's output. You can pass stderr=subprocess.STDOUT to ensure that error messages are included in the returned output -- but don't pass stderr=subprocess.PIPE to check_output. It can cause deadlocks. If you need to pipe from stderr, see the Popen example below.
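For example, a minimal sketch of this pattern (using echo and sh as stand-ins for the question's mysqladmin command; the run_command helper is just the question's placeholder name):

```python
import subprocess

def run_command(cmd):
    # check_output returns the command's output as a byte string;
    # stderr=subprocess.STDOUT folds error messages into that output,
    # matching what you'd see on the command line.
    try:
        return subprocess.check_output(cmd, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as e:
        # On a non-zero exit status, the collected output is on the exception.
        return e.output

print(run_command(['echo', 'hello']))
```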

If you're using an older Python, Vartec's method will work. But the better way to go -- at least in simple cases that don't require real-time output capturing -- is to use communicate. As in:

output = subprocess.Popen(["mycmd", "myarg"], stdout=subprocess.PIPE).communicate()[0]

Or

>>> import subprocess
>>> p = subprocess.Popen(['ls', '-a'], stdout=subprocess.PIPE, 
...                                    stderr=subprocess.PIPE)
>>> out, err = p.communicate()
>>> print out
.
..
foo

If you set stdin=PIPE, communicate also allows you to pass data to the process via stdin:

>>> cmd = ['awk', 'length($0) > 5']
>>> p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
...                           stderr=subprocess.PIPE,
...                           stdin=subprocess.PIPE)
>>> out, err = p.communicate('foo\nfoofoo\n')
>>> print out
foofoo

Finally, note Aaron Hall's answer, which indicates that on some systems, you may need to set stdout, stderr, and stdin all to PIPE (or DEVNULL) to get communicate to work at all.

Yes, I saw this one, but I use 2.6 (my mistake not to mention python version) – Silver Light Jan 21 '11 at 15:49

Both with check_output() and communicate() you have to wait until the process is done, with poll() you're getting output as it comes. Really depends what you need. – vartec Apr 5 '12 at 9:44

This answer worked for me where others didn't. – Cody Brown Mar 18 '13 at 19:54

Not sure if this only applies to later versions of Python, but the variable out was of type <class 'bytes'> for me. In order to get the output as a string I had to decode it before printing like so: out.decode("utf-8") – PolyMesh Oct 31 '13 at 19:42
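(For example, on Python 3, where check_output returns bytes; a small sketch using echo as a trivial command:)

```python
import subprocess

out = subprocess.check_output(['echo', 'hi'])
print(type(out))            # bytes on Python 3
print(out.decode('utf-8'))  # decoded to str
```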
@Elena, you're right, I wasn't explicit. I meant don't pass stderr=subprocess.PIPE to check_output! Passing it to Popen is fine. I edited the answer; let me know if it seems clear enough now. – senderle Jun 14 '15 at 12:19

This is way easier, but only works on Unix (including Cygwin).

import commands
print commands.getstatusoutput('wc -l file')

It returns a tuple of (return_value, output).
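The commands module was removed in Python 3; there, subprocess.getstatusoutput provides the same (status, output) interface. A sketch:

```python
import subprocess

# Same Unix-style (exit_status, output) tuple as commands.getstatusoutput;
# the output has its trailing newline stripped.
status, output = subprocess.getstatusoutput('echo hello')
print(status, output)
```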

simple but effective – rikAtee May 20 '12 at 21:54

Deprecated now, but very useful for old python versions without subprocess.check_output – static_rtti Jun 13 '12 at 8:20

Note that this is Unix-specific. It will for example fail on Windows. – Zitrax Jan 21 '13 at 9:50

+1 I have to work on ancient version of python 2.4 and this was VERY helpful – javadba Mar 14 '14 at 22:14

nicer than others. deprecated in 3.x which no one here uses – Erik Aronesty May 12 '15 at 14:06

Something like this:

import subprocess

def runProcess(exe):
    p = subprocess.Popen(exe, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        retcode = p.poll()  # returns None while subprocess is running
        line = p.stdout.readline()
        yield line
        if retcode is not None:
            break

Note that I'm redirecting stderr to stdout; it might not be exactly what you want, but I want error messages also.

This function yields output line by line as it comes (normally you'd have to wait for the subprocess to finish to get the output as a whole).

For your case the usage would be:

for line in runProcess('mysqladmin create test -uroot -pmysqladmin12'.split()):
    print line,
Thank you for your help! But function goes into an infinite loop for me... – Silver Light Jan 21 '11 at 15:17

-1: it is an infinite loop if the retcode is 0. The check should be if retcode is not None. You should not yield empty strings (even an empty line is at least one symbol '\n'): if line: yield line. Call p.stdout.close() at the end. – J.F. Sebastian Jan 24 '11 at 9:37

I tried the code with ls -l /dirname and it breaks after listing two files while there are many more files in the directory – Vasilis Sep 30 '13 at 20:01

@Vasilis: check similar answer – J.F. Sebastian Nov 13 '13 at 1:28

@fuenfundachtzig: .readlines() won't return until all output is read and therefore it breaks for large output that does not fit in memory. Also to avoid missing buffered data after the subprocess exited there should be an analog of if retcode is not None: yield from p.stdout.readlines(); break – J.F. Sebastian Dec 21 '13 at 5:15

Vartec's answer doesn't read all lines, so I made a version that did:

def run_command(command):
    p = subprocess.Popen(command,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    return iter(p.stdout.readline, b'')

Usage is the same as the accepted answer:

command = 'mysqladmin create test -uroot -pmysqladmin12'.split()
for line in run_command(command):
    print(line)
you could use return iter(p.stdout.readline, b'') instead of the while loop – J.F. Sebastian Nov 22 '12 at 15:44

That is a pretty cool use of iter, didn't know that! I updated the code. – Max Ekman Nov 28 '12 at 21:53

I'm pretty sure stdout keeps all output, it's a stream object with a buffer. I use a very similar technique to deplete all remaining output after a Popen has completed, and in my case, using poll() and readline during the execution to capture output live also. – Max Ekman Nov 28 '12 at 21:55

I've removed my misleading comment. I can confirm, p.stdout.readline() may return the non-empty previously-buffered output even if the child process has exited already (p.poll() is not None). – J.F. Sebastian Sep 18 '14 at 3:12

This code doesn't work. See here stackoverflow.com/questions/24340877/… – thang May 3 '15 at 6:00

Your mileage may vary: I attempted @senderle's spin on Vartec's solution in Windows on Python 2.6.5, but I was getting errors, and no other solutions worked. My error was: WindowsError: [Error 6] The handle is invalid.

I found that I had to assign PIPE to every handle to get it to return the output I expected - the following worked for me.

import subprocess

def run_command(cmd):
    """given shell command, returns communication tuple of stdout and stderr"""
    return subprocess.Popen(cmd, 
                            stdout=subprocess.PIPE, 
                            stderr=subprocess.PIPE, 
                            stdin=subprocess.PIPE).communicate()

and call it like this ([0] gets the first element of the tuple, stdout):

run_command('tracert 11.1.0.1')[0]

After learning more, I believe I need these pipe arguments because I'm working on a custom system that uses different handles, so I had to directly control all the standard handles.

To stop console popups (with Windows), do this:

def run_command(cmd):
    """given shell command, returns communication tuple of stdout and stderr"""
    # instantiate a startupinfo obj:
    startupinfo = subprocess.STARTUPINFO()
    # set the use show window flag, might make conditional on being in Windows:
    startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
    # pass as the startupinfo keyword argument:
    return subprocess.Popen(cmd,
                            stdout=subprocess.PIPE, 
                            stderr=subprocess.PIPE, 
                            stdin=subprocess.PIPE, 
                            startupinfo=startupinfo).communicate()

run_command('tracert 11.1.0.1')
Interesting -- this must be a Windows thing. I'll add a note pointing to this in case people are getting similar errors. – senderle May 1 '14 at 14:04

use DEVNULL instead of subprocess.PIPE if you don't write/read from a pipe otherwise you may hang the child process. – J.F. Sebastian Sep 9 '14 at 10:57

Sounds like a good tip, @J.F.Sebastian – Aaron Hall Sep 18 '14 at 1:45

Modern Python solution (>= 3.1):

 res = subprocess.check_output(lcmd, stderr=subprocess.STDOUT)
As the accepted answer says, check_output() is available since Python 2.7. – J.F. Sebastian Apr 21 '14 at 17:13

In Python 3.5:

import subprocess

output = subprocess.run("ls -l", shell=True, stdout=subprocess.PIPE, 
                        universal_newlines=True)
print(output.stdout)
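subprocess.run also exposes the exit status, and check=True turns a failure into an exception; a small sketch assuming Python >= 3.5:

```python
import subprocess

# check=True raises CalledProcessError if the command exits non-zero;
# universal_newlines=True makes result.stdout a str instead of bytes.
result = subprocess.run("echo hello", shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True, check=True)
print(result.returncode)  # prints: 0
print(result.stdout)      # prints: hello
```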

I had a slightly different flavor of the same problem with the following requirements:

  1. Capture and return STDOUT messages as they accumulate in the STDOUT buffer (i.e. in realtime).
     • @vartec solved this Pythonically with his use of generators and the 'yield' keyword above
  2. Print all STDOUT lines (even if the process exits before the STDOUT buffer can be fully read)
  3. Don't waste CPU cycles polling the process at high frequency
  4. Check the return code of the subprocess
  5. Print STDERR (separate from STDOUT) if we get a non-zero error return code.

I've combined and tweaked previous answers to come up with the following:

import subprocess
from time import sleep

def run_command(command):
    p = subprocess.Popen(command,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE,
                         shell=True)
    # Read stdout from subprocess until the buffer is empty!
    for line in iter(p.stdout.readline, b''):
        if line:  # Don't print blank lines
            yield line
    # This ensures the process has completed, AND sets the 'returncode' attr
    while p.poll() is None:
        sleep(.1)  # Don't waste CPU cycles
    # Empty STDERR buffer
    err = p.stderr.read()
    if p.returncode != 0:
        # The run_command() function is responsible for logging STDERR
        print "Error: " + err

This code would be executed the same as previous answers:

for line in run_command(cmd):
    print line

I had the same problem, but figured out a very simple way of doing this:

import subprocess
output = subprocess.getoutput("ls -l")
print(output)

Hope it helps out

How does this solve the OP's problem? Please elaborate. – RamenChef Nov 12 '16 at 20:21

It returns the output of command as string, as simple as that – Azhar Khan Dec 4 '16 at 7:55

Doesn't work on Python 2 – Allan Deamon Jan 15 at 17:45

Of course, print is a statement on Python 2. You should be able to figure out this is a Python 3 answer. – Dev 11 hours ago

This is a tricky solution, but I think it works in many situations.

import os
os.system('sample_cmd > tmp')
print open('tmp', 'r').read()

A temporary file (here tmp) is created with the output of the command, and you can read your desired output from it.

Hacky but super simple + works anywhere .. can combine it with mktemp to make it work in threaded situations I guess – Prakash Rajagaopal Oct 18 '16 at 1:32
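Following that suggestion, here is a sketch of a thread-safer variant using the tempfile module for a unique filename per call (run_to_string is a hypothetical helper name, not from the answer above):

```python
import os
import subprocess
import tempfile

def run_to_string(cmd):
    # Each call gets its own temp file, so concurrent calls don't collide.
    fd, path = tempfile.mkstemp()
    os.close(fd)
    try:
        with open(path, 'w') as f:
            # Redirect both stdout and stderr into the temp file.
            subprocess.call(cmd, shell=True, stdout=f, stderr=subprocess.STDOUT)
        with open(path) as f:
            return f.read()
    finally:
        os.remove(path)

print(run_to_string('echo hello'))
```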

If you need to run a shell command on multiple files, this did the trick for me.

import os
import subprocess

# Define a function for running commands and capturing stdout line by line
# (Modified from Vartec's solution because it wasn't printing all lines)
def runProcess(exe):    
    p = subprocess.Popen(exe, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return iter(p.stdout.readline, b'')

# Get all filenames in working directory
for filename in os.listdir('./'):
    # This command will be run on each file
    cmd = 'nm ' + filename

    # Run the command and capture the output line by line.
    for line in runProcess(cmd.split()):
        # Eliminate leading and trailing whitespace
        line = line.strip()
        # Split the output 
        output = line.split()

        # Filter the output and print relevant lines
        if len(output) > 2:
            if output[2] == 'set_program_name':
                print filename
                print line

Edit: Just saw Max Persson's solution with J.F. Sebastian's suggestion. Went ahead and incorporated that.

