Threading and forking combined

Mixing multiprocessing and threading is generally problematic and a recipe for deadlocks.

The following code was posted to the Python bug tracker in 2016 as https://bugs.python.org/issue27422:

[1]:
import multiprocessing
import subprocess
import sys

from concurrent.futures import ThreadPoolExecutor


def run(arg):
    print("starting %s" % arg)
    p = multiprocessing.Process(target=print, args=("running", arg))
    p.start()
    p.join()
    print("finished %s" % arg)


if __name__ == "__main__":
    n = 16
    tests = range(n)
    with ThreadPoolExecutor(n) as pool:
        for r in pool.map(run, tests):
            pass
starting 0starting 1

starting 2
starting 3
starting 4
starting 5
starting 6
starting 7
starting 8
starting 9
starting 10
starting 11
starting 12
starting 13
starting 14
starting 15
running 0
finished 4
running 4
running 1
running 2
running 3
running 15
finished 15
finished 0
finished 3
finished 1
finished 2
running 5
finished 5
running 7
running 6
finished 7
running 11
finished 6
running 9
finished 11
finished 9
running 8
running 10
finished 8
finished 10
running 13
finished 13
running 12
running 14
finished 12
finished 14

Usually, threading is recommended after the fork, not before. Otherwise, any lock held by one of the threads at the moment of the fork is duplicated into the child process in its locked state. Because the thread that holds it does not exist in the child, the lock can never be released there, and the child deadlocks as soon as it tries to acquire it.
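
The following is a minimal sketch of the recommended ordering, reversing the example above: the processes are forked first, and a thread pool is only created inside each child process, i.e. after the fork. The function name ``work`` and the process/thread counts are illustrative and not part of the bug report.

import multiprocessing

from concurrent.futures import ThreadPoolExecutor


def work(arg):
    # The threads are created here, inside the child process, i.e. after
    # the fork, so no lock from a parent thread can be inherited in a
    # locked state.
    with ThreadPoolExecutor(4) as pool:
        for r in pool.map(print, ["running %s.%s" % (arg, i) for i in range(4)]):
            pass


if __name__ == "__main__":
    processes = [multiprocessing.Process(target=work, args=(arg,)) for arg in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()

Alternatively, the fork can be avoided altogether by selecting the spawn start method, for example via multiprocessing.get_context("spawn"): each child then starts a fresh interpreter and does not inherit the parent's locks.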