Friday 15 January 2010

python - Wrap Multiprocess Pool Inside Loop (Shared Memory Between Processes)

I'm using the Python package DEAP to solve multiobjective optimization problems with genetic algorithms. The functions can be quite expensive, and because of the evolutionary nature of the GA, the cost gets compounded pretty quickly. The package does have support for parallelizing the evolutionary computations with multiprocessing.

However, I'd like to go one step further and run the optimization multiple times, with different values for some of the optimization parameters. For instance, I might want to solve the optimization problem with different values of the weights.

This seems like a pretty natural case for a for loop, but the problem is that these parameters must be defined in the global scope of the program (i.e., above the main function) so the sub-processes know about them. Here's some pseudo-code:

import itertools
import multiprocessing

import numpy as np
from deap import base, creator, tools

# define DEAP parameters - these have to be in the global scope
# so the sub-processes know about them
toolbox = base.Toolbox()
history = tools.History()
weights = [1, 1, -1]  # this is what I want to vary
creator.create("Fitness", base.Fitness, weights=weights)
creator.create("Individual", np.ndarray, fitness=creator.Fitness)

def main():
    # run the GA to solve the multiobjective optimization problem
    return my_optimized_values

if __name__ == '__main__':
    ## What I'd like to do, but can't:
    ##
    ## all_weights = list(itertools.product([1, -1], repeat=3))
    ## for combo in all_weights:
    ##     weights = combo
    pool = multiprocessing.Pool(processes=6)
    # This can go down here, and it distributes the GA
    # computations to the pool of workers
    toolbox.register("map", pool.map)
    my_values = main()

I've investigated various possibilities, like multiprocessing.Value, the pathos fork of multiprocessing, and others, but in the end there's always a problem with the child processes reading the Individual class.
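(For what it's worth, this kind of pitfall is more general than DEAP: multiprocessing pickles whatever it sends to the workers, and pickle records classes by reference, so a class that only exists as a runtime object in the parent can't be looked up when a child unpickles an instance of it. A minimal sketch of that general behavior, using an illustrative stand-in rather than DEAP's own Individual:

import pickle

def build_individual():
    # A class created at runtime and bound only to a local name,
    # roughly analogous to a dynamically created Individual.
    Individual = type("Individual", (list,), {})
    return Individual([1, 2, 3])

if __name__ == "__main__":
    try:
        pickle.dumps(build_individual())
    except pickle.PicklingError as exc:
        # pickle can't find __main__.Individual to reference
        print("pickling failed:", exc)

The same lookup happens in a worker process when it unpickles a task, which is why classes have to be importable, or re-created, on the child side.)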

I've posed this question on the DEAP users' group, but it's not as big a community as Stack Overflow. Besides, it seems to me that this is more of a general conceptual Python question than a specific issue with DEAP. My current solution is to just run the code multiple times, changing some of the parameter definitions each time. At least that way the GA calculations are still parallelized, but it requires more manual intervention than I'd like.

Any advice or suggestions would be appreciated!

Use the initializer/initargs keyword arguments of Pool to pass different values for the global variables that need to change on each run. The initializer function is called with initargs as its arguments in each worker process inside the Pool, as soon as it starts up. You can set your global variables to the desired values there, and they'll be set correctly inside each child for the lifetime of the pool.
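For illustration, here's a minimal sketch of the mechanism with a plain Pool; the names (scale, init, work) are made up for the example and aren't part of the question's code:

import multiprocessing

scale = None  # global that each worker sets at startup

def init(_scale):
    # Runs once in every worker process when the pool starts.
    global scale
    scale = _scale

def work(x):
    return x * scale  # workers see the value set by init()

if __name__ == "__main__":
    for s in (2, 10):
        pool = multiprocessing.Pool(processes=2, initializer=init,
                                    initargs=(s,))
        print(pool.map(work, [1, 2, 3]))  # [2, 4, 6], then [10, 20, 30]
        pool.close()
        pool.join()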

You'll need to create a different Pool for each run, but that shouldn't be a problem:

toolbox = base.Toolbox()
history = tools.History()
weights = None  # We'll set this in the children later.

def init(_weights):
    # This will run in each child process as it starts up.
    global weights
    weights = _weights
    creator.create("Fitness", base.Fitness, weights=weights)
    creator.create("Individual", np.ndarray, fitness=creator.Fitness)

if __name__ == '__main__':
    all_weights = list(itertools.product([1, -1], repeat=3))
    for combo in all_weights:
        weights = combo
        pool = multiprocessing.Pool(processes=6, initializer=init,
                                    initargs=(weights,))
        toolbox.register("map", pool.map)
        my_values = main()
        pool.close()
        pool.join()
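The close()/join() pair at the end of each iteration makes sure the previous pool's workers have exited before the next Pool is created, so every run only ever sees the weights it was given through initargs. (On Python 3.3+ a Pool can also be used as a context manager, but note that exiting the with block calls terminate() rather than close(), so the explicit close()/join() is the safer pattern here.)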

python multiprocessing python-multiprocessing
