Saturday 15 February 2014

python - How to use multiprocessing in a function?

I want to define a function in "a.py" that uses multiprocessing for parallelization, and then import it in "b.py" as a library function. For example, in "a.py":

import multiprocessing as mp, queue

def mpworker(input, i):
    input.put(i)

def mptest(maxmpnum):
    jobs = []
    batchresult = queue.Queue()
    for i in range(maxmpnum):
        p = mp.Process(target=mpworker, args=(batchresult, i + 1))
        p.start()
        print("This is", i)
        jobs.append(p)
    for i in range(maxmpnum):
        print("Getting", i)
        result = batchresult.get()
        print(result)

Then in "b.py":

import a

a.mptest(10)

However, this won't work, and I get the error: _pickle.PicklingError: Can't pickle <class '_thread.lock'>: attribute lookup lock on _thread failed. So, is it possible to use multiprocessing in Python this way, or am I missing something?

The entire traceback, edited (Python 3.x, Windows):

Traceback (most recent call last):
  File "F:/b.py", line 72, in <module>
    a.mptest(5)
  File "F:\a.py", line 566, in mptest
    p.start()
  File "C:\Python34\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Python34\lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python34\lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Python34\lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python34\lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <class '_thread.lock'>: attribute lookup lock on _thread failed

The problem is that you're using queue.Queue, which only works between threads in the same process, instead of multiprocessing.Queue, which works between processes.

Depending on the platform, and the way you use it, this will fail in different ways. In your case, because you're trying to pass the queue as an argument to the Process constructor, and you're on Windows, you get the nicest error: it tries to pickle the queue itself, and fails.* On Unix, you may successfully pass the queue to the child processes, but then either lose some of the values (OS X) or deadlock (most other systems) when you use it.

As the docs explain, multiprocessing.Queue "is a near clone of queue.Queue", except that it's "thread and process safe" instead of just thread safe.
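Here is a minimal corrected sketch of "a.py", assuming the only required change is swapping queue.Queue for a multiprocessing.Queue (the join loop at the end is an addition for tidiness, not part of the original code):

import multiprocessing as mp

def mpworker(output, i):
    # Runs in a child process; puts its result on the process-safe queue.
    output.put(i)

def mptest(maxmpnum):
    jobs = []
    batchresult = mp.Queue()  # multiprocessing.Queue, not queue.Queue
    for i in range(maxmpnum):
        p = mp.Process(target=mpworker, args=(batchresult, i + 1))
        p.start()
        jobs.append(p)
    for i in range(maxmpnum):
        print("Getting", i)
        print(batchresult.get())
    for p in jobs:
        p.join()

On Windows (which spawns children by re-importing the main module), "b.py" should also guard the call in the usual way so the import doesn't re-run it:

import a

if __name__ == '__main__':
    a.mptest(10)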

If you thought you were using multiprocessing.Queue, the error is in this line:

import multiprocessing as mp, queue

This doesn't import multiprocessing as both mp and queue; it imports multiprocessing as mp, and imports the queue module itself. See the reference on import for details.

The fact that this is ambiguous to a human (even though it's not ambiguous to the parser) is one of the reasons multi-import statements are discouraged in Python. For example, PEP 8 says "Imports should usually be on separate lines".
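To make that concrete, here's a small illustrative sketch (not from the original question) of what the one-line import binds, next to the PEP 8 style equivalent:

# This single statement:
import multiprocessing as mp, queue

# binds mp to the multiprocessing package and queue to the plain queue module.
# It is equivalent to the PEP 8 preferred form:
import multiprocessing as mp
import queue

# Either way, queue.Queue is the thread-only class, not multiprocessing.Queue.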

* It might be nicer if queue.Queue raised an exception itself when you pickled it, instead of relying on the fact that it uses unpicklable threading synchronization objects, because it's not entirely obvious that the complaint about pickling _thread.lock was caused by pickling a queue.Queue.
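For what it's worth, you can see the underlying cause by pickling a queue.Queue directly; the exact exception type and message vary across Python versions, but they complain about the internal _thread.lock rather than the Queue itself (a small sketch, not from the original answer):

import pickle
import queue

try:
    # A queue.Queue holds threading locks/conditions, which can't be pickled.
    pickle.dumps(queue.Queue())
except Exception as e:
    print(type(e).__name__, ":", e)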

python python-3.x multiprocessing
