python - Is it possible to apply a callable object to a pool of processes?
Whenever I call pool.apply_async, I have to pass it a function to be run in the process. I tried passing a callable object instead, but it did nothing. Is there a way to do this, or do I have to design the pool myself from scratch?
The code is as follows:
import multiprocessing
import queue

class TaskThread(object):
    def __init__(self):
        pass
        #self.queue = queue.Queue()

    def __call__(self):
        print("In TaskThread.__call__")
        #self.queue.put(1)

pool = multiprocessing.Pool(4)
task = TaskThread()
pool.apply_async(task)
Something like that.
The problem is that you didn't call get() on the AsyncResult returned by apply_async, nor did you use pool.close()/pool.join() to wait until the child processes are done working before exiting the main process. Since the worker processes inside a Pool are daemons, they are terminated as soon as the main process exits. That means your example program exits (and takes its children with it) before the child process gets a chance to print anything. You can fix this by calling .get() on the AsyncResult, or by adding close()/join() calls:
import multiprocessing

class TaskThread(object):
    def __call__(self):
        print("In TaskThread.__call__")

pool = multiprocessing.Pool(4)
task = TaskThread()
pool.apply_async(task)
pool.close()
pool.join()
Or:
import multiprocessing

class TaskThread(object):
    def __call__(self):
        print("In TaskThread.__call__")

pool = multiprocessing.Pool(4)
task = TaskThread()
result = pool.apply_async(task)
result.get()
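One extra benefit of result.get() worth noting: it also re-raises any exception that was thrown inside the worker, which would otherwise go unnoticed with apply_async alone. A minimal sketch, using a hypothetical FailingTask class purely for illustration:

import multiprocessing

class FailingTask(object):
    # hypothetical callable that always fails, to show how the
    # worker's exception surfaces in the parent via get()
    def __call__(self):
        raise ValueError("boom")

if __name__ == "__main__":
    pool = multiprocessing.Pool(4)
    result = pool.apply_async(FailingTask())
    try:
        result.get()          # re-raises the worker's ValueError here
    except ValueError as e:
        print("worker failed:", e)
    pool.close()
    pool.join()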
Edit:

In order to pass a Queue the way you're trying to, you'd need to do this:
import multiprocessing

class TaskThread(object):
    def __init__(self, manager):
        self.queue = manager.Queue()

    def __call__(self):
        print("In TaskThread.__call__")
        self.queue.put(1)

if __name__ == "__main__":
    pool = multiprocessing.Pool(4)
    m = multiprocessing.Manager()
    task = TaskThread(m)
    result = pool.apply_async(task)
    result.get()
    print(task.queue.get())
Output:
In TaskThread.__call__
1
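The Manager is what makes this work: a plain multiprocessing.Queue stored as an instance attribute can't be pickled and shipped to the pool's workers, while the manager's Queue is a picklable proxy. If you'd rather not store the queue on the object at all, here is an alternative sketch (my own variant, not part of the original answer) that hands the managed queue to the callable through apply_async's args:

import multiprocessing

class TaskThread(object):
    # variant: the queue is passed in at call time instead of
    # being stored on the instance in __init__
    def __call__(self, q):
        print("In TaskThread.__call__")
        q.put(1)

if __name__ == "__main__":
    m = multiprocessing.Manager()
    q = m.Queue()                     # proxy object, safe to send to workers
    pool = multiprocessing.Pool(4)
    result = pool.apply_async(TaskThread(), args=(q,))
    result.get()                      # wait for the worker to finish
    print(q.get())                    # prints 1
    pool.close()
    pool.join()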
Tags: python, multiprocessing, python-multiprocessing