python - How to create an infinite loop with apply_async?


I have a pool of processes launched with apply_async. The different processes take different amounts of time to produce their output. Once one process finishes its calculation, I want to do some calculations with its output and then launch a new process. In this way I want to create an infinite loop that launches processes, reads the output of the finished processes, does some calculations, and relaunches new processes.

So far I have been able to do roughly what I want, except that the main process gets stuck in the get() function, because I don't know which process has terminated and hence which entry of the results list I should get().

My attempt so far:

import multiprocessing as mp
import numpy as np
from time import sleep


def squared(x, y):
    result = np.array((x, x))
    if x % 2 == 0:
        sleep(2)
    return result


if __name__ == "__main__":
    pool = mp.Pool()

    pool_r = []
    for i in xrange(0, 8):
        pool_r.append(pool.apply_async(squared, (i, i)))

    count_results = 0
    for j in xrange(0, 10):
        result = pool_r[count_results].get()
        print result
        count_results += 1
        pool_r.append(pool.apply_async(squared, (j, j)))

    pool.close()
    pool.join()

And the output is:

[0 0]
[1 1]
[2 2]
[3 3]
[4 4]
[5 5]
[6 6]
[7 7]
[0 0]
[1 1]

instead of the odd numbers coming first and the even ones after (since the even ones have the sleep).

Any suggestions?


Thank you for the fast reply, abarnert.

In reality I want to keep the infinite loop going after the first processes are completed (I need their results to be able to enter the loop).

Q1 - If I create a pool with 30 workers, can I submit more than 30 jobs? Will the computer wait for one worker to finish before putting in more work?

Q2 - In your code there is a callback function. However, the code I need to run when one worker finishes has to live in the main process, since I have to update the variables that are sent to the new processes I create.

Q3 - The code in the main process takes, let's say, 10% of the time the worker processes need to complete their tasks. Is it the right approach to have the main process do these calculations and then launch the new processes?

Q4 - Right now, if I press Ctrl+C the code terminates only once all the processes are over. Is there anything I can do to make it terminate on Ctrl+C right away? And finally, after this comment, do you think futures is still the way to go?

Some pseudo-code for what I need:

launch several processes
wait for the results

launch several more processes

while True:
    get the results from a finished process
    do some calculations
    launch two more processes
    # ending condition

The problem is that you're waiting for the results in the order the jobs were issued, rather than in the order they finish. So if job 1 finishes before job 0, it doesn't matter; you're still waiting on job 0 first.

Fundamentally, the problem is that apply_async returns AsyncResult objects, which are not composable futures, but you want to use them as if they were. You can't do that: there's no way to wait on a bunch of AsyncResults in parallel until one of them finishes. If you want that, use concurrent.futures instead (or, on Python 2.7, its backport on PyPI, futures); then you can call wait on a sequence of futures, or iterate over as_completed.
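For example, here is a minimal sketch of that approach, assuming the backported futures package is installed on 2.7 (pip install futures); squared is the same worker as in the question, and the job counts (8 initial, 18 total) just mirror the numbers used above:

from concurrent.futures import ProcessPoolExecutor, wait, FIRST_COMPLETED
from time import sleep

import numpy as np


def squared(x, y):
    result = np.array((x, x))
    if x % 2 == 0:
        sleep(2)
    return result


if __name__ == "__main__":
    executor = ProcessPoolExecutor()

    # Launch the initial batch of jobs.
    pending = set(executor.submit(squared, i, i) for i in xrange(8))
    launched = 8

    while pending:
        # Block until at least one job has finished, whichever one it is.
        done, pending = wait(pending, return_when=FIRST_COMPLETED)
        for future in done:
            print future.result()    # handled in completion order, not submission order
            if launched < 18:        # the "ending condition"
                # Do your calculations here, then launch a replacement job.
                pending.add(executor.submit(squared, launched, launched))
                launched += 1

    executor.shutdown()

Because wait returns as soon as the first job finishes, the main process is never stuck blocking on a slow job while fast ones sit done in the queue, which is exactly the problem with calling get() in submission order.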

You can simulate this on top of AsyncResult by using callbacks instead of wait, but that makes your life harder than it needs to be, because you have to turn the flow of control inside-out. Something like this:

pool = mp.Pool()

count_results = 0

def handle_result(result):
    # Runs in the main process each time a worker finishes a job.
    global count_results
    print result
    if count_results < 10:
        # One replacement job per finished job: 8 initial + 10 extra = 18.
        pool.apply_async(squared, (count_results, count_results),
                         callback=handle_result)
    count_results += 1
    if count_results == 18:
        # All 18 results are in; close so join() in the main thread returns.
        pool.close()

for i in xrange(0, 8):
    pool.apply_async(squared, (i, i), callback=handle_result)

pool.join()
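One caveat if you go the callback route (as far as I know, this is how CPython's Pool behaves): the callbacks all run on a single result-handler thread inside the main process, so updating shared state like count_results there is safe without a lock, but any slow work you do inside a callback delays the delivery of every other result, and the inside-out control flow gets painful as soon as the per-result logic grows. That's why I'd still point you at futures.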
