Python Multiprocessing doesn't seem to end

I have a process which moves lots of data into a database. I use multiprocessing for this.

It runs nice and quickly, but even when it’s finished (all the rows are moved), it doesn’t seem to end.

I’ve added join, since I understood it to mean that the main process waits until all the worker processes have finished.

Is there something I have missed here? Why doesn’t it end?

import multiprocessing as mp

p = mp.Pool(mp.cpu_count())
p.map(do_process, result)
p.close()
p.join()

Solution:

A nicer way of doing this is to use joblib:

import joblib

with joblib.parallel_backend('loky'):
    results = joblib.Parallel(n_jobs=-1)(
        joblib.delayed(do_process)(item)
        for item in result
    )

I am assuming here that do_process should be mapped over each item in result. The with statement ensures the backend’s worker processes are cleaned up when the block exits. If you want to see what is happening, you can pass a verbose setting to the joblib.Parallel object.
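To illustrate the verbosity point, here is a minimal sketch (assuming joblib is installed, with a toy do_process standing in for the real work):

```python
import joblib

def do_process(item):
    # toy stand-in for the real work
    return item + 1

# verbose=10 prints progress reports to stderr as batches of tasks complete
results = joblib.Parallel(n_jobs=2, verbose=10)(
    joblib.delayed(do_process)(i) for i in range(4)
)
print(results)
```

joblib.Parallel returns the results in the same order as the input iterable, regardless of which worker finished first.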

Hope it helps.
