Utilizing the Multiprocessing Tool with NumPy in Python
I am trying to use the multiprocessing tool with NumPy arrays. I want to run func1() and func2() simultaneously and then use np.concatenate() to combine Solution_1 and Solution_2.
Solution 1:
For example:
import multiprocessing as mp
import numpy as np
import pandas as pd

num_cores = mp.cpu_count()
Numbers = np.array([1,2,3,4,5,6,7,8,9,10,11,12])
def func1():
    Solution_1 = Numbers + 10
    return Solution_1

def func2():
    Solution_2 = Numbers * 10
    return Solution_2
# Set up the worker pool, leaving one core aside
pool = mp.Pool(num_cores-1)
# This is to use all functions easily
functions = [func1, func2]
# This is to store the results
solutions = []
# Note: pool.apply blocks until each result is ready, so these calls run one after another
for function in functions:
    solutions.append(pool.apply(function, ()))
solutions now looks like:
[array([11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22]),
array([ 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110, 120])]
You could place that directly in a dataframe:
solutions = pd.DataFrame(solutions)
print(solutions)
0 1 2 3 4 5 6 7 8 9 10 11
0 11 12 13 14 15 16 17 18 19 20 21 22
1 10 20 30 40 50 60 70 80 90 100 110 120
I see similar cases / alternatives:
- Global variables, very similar to the original case posted here: https://stackoverflow.com/a/40399562/7127519
- apply_async: https://stackoverflow.com/a/48933782/7127519
- A shared variable to communicate: How can I recover the return value of a function passed to multiprocessing.Process?
- multiprocessing.Manager(): https://stackoverflow.com/a/10415215/7127519
And others.