Python’s multiprocessing module allows you to run code in multiple processes, taking advantage of multiple CPU cores. Unlike threading, which is limited by the Global Interpreter Lock (GIL) for CPU-bound tasks, multiprocessing starts separate Python interpreter processes that can truly execute in parallel.
Use multiprocessing when you want to speed up heavy CPU-bound work. The module provides several building blocks:

- Process class: low-level API to start and manage processes manually.
- Pool class: higher-level API to run a function on many inputs in parallel.
- Queue, Pipe, or shared memory to exchange data between processes.

Creating a process manually:

```python
from multiprocessing import Process

p = Process(target=func, args=(arg1, arg2))
```

- p.start() – starts the new process.
- p.join() – waits for the process to finish.

Running a function over many inputs with a pool:

```python
from multiprocessing import Pool

with Pool(processes=n) as pool:
    results = pool.map(func, iterable)
```

Exchanging data between processes:

- Queue – safe FIFO queue shared between processes.
- Pipe – two-way communication channel.
- Value, Array, Manager – share data structures safely.

Useful utilities:

- multiprocessing.cpu_count() – number of available cores.
- set_start_method() – controls how new processes are spawned: spawn, fork, or forkserver (platform dependent).
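Value and Array from the list above hold data in shared memory that several processes can update; a minimal sketch (the counter/slots names are illustrative, and the explicit get_lock() calls keep the read-modify-write updates atomic):

```python
from multiprocessing import Process, Value, Array

def increment(counter, slots):
    # get_lock() returns the lock guarding the shared object;
    # holding it makes the += update atomic across processes
    with counter.get_lock():
        counter.value += 1
    with slots.get_lock():
        for i in range(len(slots)):
            slots[i] += 1

if __name__ == "__main__":
    counter = Value("i", 0)        # shared C int
    slots = Array("i", [0, 0, 0])  # shared fixed-size int array
    workers = [Process(target=increment, args=(counter, slots)) for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(counter.value, list(slots))  # 4 [4, 4, 4]
```

Without the locks, concurrent `+=` updates could interleave and lose increments, because each `+=` is a separate read and write on the shared memory.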
```python
from multiprocessing import Process
import time

def worker(name, delay):
    print(f"[{name}] starting work")
    time.sleep(delay)
    print(f"[{name}] finished after {delay} seconds")

if __name__ == "__main__":
    p1 = Process(target=worker, args=("Process-1", 2))
    p2 = Process(target=worker, args=("Process-2", 3))
    p1.start()  # run in parallel
    p2.start()
    p1.join()   # wait for p1 to finish
    p2.join()
    print("All processes completed")
```
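The start methods mentioned in the overview (spawn, fork, forkserver) can also be selected explicitly with set_start_method(). A minimal sketch (spawn is the default on Windows and macOS, while fork has traditionally been the default on Linux):

```python
import multiprocessing as mp

def hello():
    # the child reports which start method created it
    print("child start method:", mp.get_start_method())

if __name__ == "__main__":
    # must be called at most once, before any processes are created
    mp.set_start_method("spawn")
    p = mp.Process(target=hello)
    p.start()
    p.join()
```

With spawn, the child starts a fresh interpreter and re-imports your module, which is one more reason the `if __name__ == "__main__":` guard is required.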
```python
from multiprocessing import Pool, cpu_count
import math
import time

# Check if a number is prime
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    if n % 2 == 0 and n != 2:
        return False
    limit = int(math.sqrt(n)) + 1
    for i in range(3, limit, 2):
        if n % i == 0:
            return False
    return True

if __name__ == "__main__":
    # Large numbers to test for primality
    numbers = [10_000_019, 10_000_033, 10_000_079, 10_000_081]
    print(f"CPU cores available: {cpu_count()}")
    start = time.perf_counter()
    # Use a Pool to distribute work across processes
    with Pool() as pool:
        results = pool.map(is_prime, numbers)
    end = time.perf_counter()
    for n, r in zip(numbers, results):
        print(f"{n} prime? {r}")
    print(f"Completed in {end - start:.3f} seconds using multiprocessing")
```
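Pool.map() passes exactly one argument to the function. For functions that take several arguments, Pool.starmap() unpacks each tuple in the input list; a minimal sketch (the power() function is illustrative):

```python
from multiprocessing import Pool

def power(base, exponent):
    return base ** exponent

if __name__ == "__main__":
    pairs = [(2, 3), (3, 2), (5, 4)]
    with Pool() as pool:
        # starmap unpacks each tuple into positional arguments
        results = pool.starmap(power, pairs)
    print(results)  # [8, 9, 625]
```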
Communicating with a Queue:
```python
from multiprocessing import Process, Queue

def square_worker(numbers, queue):
    for n in numbers:
        queue.put((n, n * n))
    queue.put(None)  # sentinel value to mark end

if __name__ == "__main__":
    nums = [1, 2, 3, 4, 5]
    q = Queue()
    p = Process(target=square_worker, args=(nums, q))
    p.start()
    while True:
        item = q.get()
        if item is None:
            break
        n, sq = item
        print(f"{n} squared is {sq}")
    p.join()
    print("Done reading from queue")
```
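Pipe, mentioned in the overview, is the lighter alternative to a Queue: it connects exactly two endpoints and allows messages in both directions. A minimal sketch (the responder() function is illustrative):

```python
from multiprocessing import Process, Pipe

def responder(conn):
    # receive one message, send a reply, then close our end
    msg = conn.recv()
    conn.send(f"got: {msg}")
    conn.close()

if __name__ == "__main__":
    parent_conn, child_conn = Pipe()  # two connected endpoints
    p = Process(target=responder, args=(child_conn,))
    p.start()
    parent_conn.send("hello")
    print(parent_conn.recv())  # got: hello
    p.join()
```

Prefer a Queue when more than two processes are involved; a Pipe endpoint is not safe to share between multiple readers or writers.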
```python
# Pseudocode / structure:
# 1. Define a CPU-heavy function.
# 2. Run it in a simple for-loop (sequential), measure time.
# 3. Run it again using Pool.map(), measure time.
# 4. Compare the timings to see the speedup on multi-core CPUs.
```
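The exercise outline above could be fleshed out along these lines (a sketch; the cpu_heavy() function and the workload sizes are illustrative, and the speedup you observe depends on your core count):

```python
from multiprocessing import Pool
import time

def cpu_heavy(n):
    # deliberately wasteful sum to burn CPU time
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [400_000] * 4

    start = time.perf_counter()
    sequential = [cpu_heavy(n) for n in inputs]
    seq_time = time.perf_counter() - start

    start = time.perf_counter()
    with Pool() as pool:
        parallel = pool.map(cpu_heavy, inputs)
    par_time = time.perf_counter() - start

    assert sequential == parallel  # same results either way
    print(f"sequential: {seq_time:.3f}s, pool: {par_time:.3f}s")
```

Note that each workload must be large enough to outweigh the cost of starting worker processes and pickling arguments; for tiny inputs the sequential loop often wins.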
Process example:
- Two processes (p1 and p2) run the worker() function independently.
- join() ensures the main process waits for them before printing the final message.

Pool.map() example:
- The is_prime() function checks whether a number is prime.
- Pool.map() distributes the list of numbers across worker processes.

Queue communication example:
- The worker puts (number, square) pairs into the queue; a sentinel value (None) indicates that there is no more data.

Best practices:
- Always guard the entry point with if __name__ == "__main__":, especially on Windows.
- Prefer Pool for simple “map this function over a list” style problems.
- Use Queue or Manager for coordinated communication between processes.
- Check the number of available cores with cpu_count().

Exercises:
- Compare the execution times of a plain sequential loop and Pool.map() for a large list of numbers.
- Have several worker processes send log messages through a Queue, where they are written to a single log file.

Run your scripts from the command line with: python multiprocessing_demo.py