python - Grab output from shell command which is run in the background


I saw useful information in this post about how you can't expect to run a process in the background if you are retrieving its output with subprocess. The problem is... that is exactly what I want to do!

I have a script that drops commands onto various hosts via ssh, and I don't want to have to wait on each one to finish before starting the next. Ideally, I'd have something like this:

from subprocess import Popen, PIPE

for host in hostnames:
    p[host] = Popen(["ssh", host, mycommand], stdout=PIPE, stderr=PIPE)
    pout[host], perr[host] = p[host].communicate()

which would have (in the case where mycommand takes a long time) all of the hosts running mycommand at the same time. As it is now, it appears that the entirety of each ssh command finishes before the next one starts. This is (according to the previous post I linked) due to the fact that I am capturing output, right? Other than cat-ing the output to a file and reading it back later, is there a decent way to make these things happen on various hosts in parallel?
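Yes, communicate() blocks until the process exits, so calling it inside the launch loop serializes everything. One fix is to start all the processes first and only collect output afterwards. A minimal sketch of that pattern, where hostnames and mycommand stand in for your own values:

from subprocess import Popen, PIPE

hostnames = ["host1", "host2"]  # assumption: your list of remote hosts
mycommand = "uptime"            # assumption: the command to run remotely

# Start every process before waiting on any of them; nothing blocks here.
procs = {}
for host in hostnames:
    procs[host] = Popen(["ssh", host, mycommand], stdout=PIPE, stderr=PIPE)

# Only now wait for each one; the remote commands have been running in parallel.
pout, perr = {}, {}
for host in hostnames:
    pout[host], perr[host] = procs[host].communicate()

One caveat: if a command writes more than the OS pipe buffer (typically around 64 KB) before its communicate() turn comes, it will stall until it is read, so for heavy output you would want threads or output files instead.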

You may want to use Fabric for this.

Fabric is a Python (2.5-2.7) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.

Example fabfile:

from fabric.api import run, env

def do_mycommand():
    my_command = "ls"  # change to your command
    output = run(my_command)
    print "Output of %s on %s: %s" % (my_command, env.host_string, output)

Now to execute on all hosts (host1, host2, ... wherever your hosts go):

fab -H host1,host2 do_mycommand
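One thing worth knowing: by default Fabric runs the task on each host one after another, which is the serial behaviour the question is trying to escape. Fabric 1.x can run hosts concurrently instead, either via the -P command-line flag or by marking the task with the @parallel decorator. A sketch of the same hypothetical task, made parallel:

from fabric.api import run, env, parallel

@parallel
def do_mycommand():
    my_command = "ls"  # change to your command
    output = run(my_command)
    print "Output of %s on %s: %s" % (my_command, env.host_string, output)

The invocation stays the same (fab -H host1,host2 do_mycommand); each host then gets its own worker process.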
