How to Run Multiple Python Files Concurrently or Sequentially in Python
Sometimes you need to execute multiple Python scripts as part of a larger workflow. You might want them to run simultaneously (concurrently) to save time, or perhaps one after the other (sequentially) because they depend on each other's output.
This guide demonstrates various methods, in Python and from the command line, to run multiple `.py` files concurrently or sequentially, focusing on the `subprocess` module.
Setup for the Following Examples
Assume we have three simple files, `a.py`, `b.py`, and `c.py`, which are used throughout the examples below.
```python
# a.py
import time

print('Executing Script A...')
time.sleep(1)  # Simulate work
print('Finished Script A.')
```

```python
# b.py
import time

print('Executing Script B...')
time.sleep(1)  # Simulate work
print('Finished Script B.')
```

```python
# c.py
import time

print('Executing Script C...')
time.sleep(1)  # Simulate work
print('Finished Script C.')
```
Understanding Concurrent vs. Sequential Execution
- Concurrent: Multiple scripts start running at roughly the same time, potentially overlapping their execution. This is useful for tasks that can run independently and can speed up the overall process if the system has enough resources (CPU cores). The order of output might vary between runs.
- Sequential: The second script starts only after the first script has completely finished, the third starts after the second finishes, and so on. This is necessary when scripts depend on the results or side effects of previous scripts. The order of execution is predictable.
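To make the difference concrete, here is a minimal timing sketch using the setup files above; both patterns are explained in detail in the sections below. Since each script sleeps for about one second, the sequential run takes roughly three seconds while the concurrent run takes roughly one.

```python
# timing_demo.py -- rough timing comparison (both patterns are covered below)
import subprocess
import sys
import time

scripts = ['a.py', 'b.py', 'c.py']

# Sequential: each script must finish before the next one starts
start = time.perf_counter()
for script in scripts:
    subprocess.run([sys.executable, script])
print(f"Sequential took {time.perf_counter() - start:.1f}s")  # ~3s for three 1s scripts

# Concurrent: start all scripts first, then wait for them together
start = time.perf_counter()
processes = [subprocess.Popen([sys.executable, script]) for script in scripts]
for proc in processes:
    proc.wait()
print(f"Concurrent took {time.perf_counter() - start:.1f}s")  # ~1s with spare CPU cores
```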
Running Files Concurrently (Simultaneously)
Using `subprocess.Popen` in Python (Recommended)
The `subprocess.Popen` class is designed to create and manage child processes. You start each script as a separate process without immediately waiting for it to finish.
```python
# main_concurrent_popen.py
import subprocess
import sys

# Determine the correct Python executable (python, python3, py)
python_executable = sys.executable  # Usually the safest way to get the current interpreter
print(f"Using Python executable: {python_executable}")

# List of scripts to run
scripts = ['a.py', 'b.py', 'c.py']
processes = []

print("Starting processes concurrently...")
# Create and start each process
for script in scripts:
    # Popen starts the process and returns immediately
    proc = subprocess.Popen([python_executable, script])
    processes.append(proc)
    print(f"  Started {script} with PID: {proc.pid}")

print("Waiting for processes to finish...")
# Optionally, wait for all processes to complete
for proc in processes:
    proc.wait()  # Wait for this specific process to terminate
    print(f"  Process {proc.pid} finished with code: {proc.returncode}")

print("All concurrent processes finished.")
```
- `subprocess.Popen([python_executable, script])`: Starts `script` using the specified `python_executable` in a new process. It does not wait for the script to finish.
- We store the `Popen` objects in the `processes` list.
- `proc.wait()`: Waits for a specific child process to terminate. Calling this after starting all processes allows them to run concurrently before the main script waits for completion.
- Output Order: Because the scripts run concurrently, the order of their `print` statements in the final output can vary each time you run `main_concurrent_popen.py`.
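If you want to capture each script's output instead of letting it print straight to the terminal, `Popen` supports that too. A minimal sketch using `stdout=subprocess.PIPE` and `communicate()`:

```python
# main_concurrent_capture.py -- sketch of Popen with captured output
import subprocess
import sys

scripts = ['a.py', 'b.py', 'c.py']

# Start all scripts first so they run concurrently
processes = [
    subprocess.Popen([sys.executable, script],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
    for script in scripts
]

# communicate() waits for each process and returns its captured output
for script, proc in zip(scripts, processes):
    stdout, stderr = proc.communicate()
    print(f"--- {script} (exit code {proc.returncode}) ---")
    print(stdout, end='')
    if stderr:
        print(stderr, end='')
```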
Using the Command Line (`&` operator)
Many shells (like Bash and Zsh on Linux/macOS) allow you to run a command in the background using the ampersand (`&`) symbol. This lets the shell start the next command immediately.
```bash
# Run from your terminal
python a.py & python b.py & python c.py

# Or to run the whole line in the background:
# python a.py & python b.py & python c.py &
```
- Each `python script.py &` starts a process in the background.
- This is simple but offers less control from within a Python script compared to `subprocess.Popen`.
This exact syntax might not work identically on Windows Command Prompt (though PowerShell has similar background job features).
Using a Bash Script (`&` operator)
You can put the command-line approach into a shell script.
```bash
#!/bin/bash
# my_concurrent_script.sh
echo "Starting scripts concurrently via bash..."
python a.py &
python b.py &
python c.py &
echo "Bash script initiated processes. Waiting for them (optional)..."
wait  # Optional: Bash built-in that waits for background jobs started in this script
echo "Bash script finished."
```
Run it from the terminal: `bash my_concurrent_script.sh`.
Running Files Sequentially (One After Another)
Using `subprocess.run` in Python (Recommended)
The `subprocess.run()` function is designed to run a command and wait for it to complete before proceeding. This makes it ideal for sequential execution.
```python
# main_sequential_run.py
import subprocess
import sys

python_executable = sys.executable
scripts = ['a.py', 'b.py', 'c.py']

print("Running processes sequentially...")
for script in scripts:
    print(f"--- Running {script} ---")
    # run() executes the command and WAITS for it to complete
    result = subprocess.run([python_executable, script],
                            capture_output=True, text=True, check=False)
    print(f"--- Finished {script} ---")
    print(f"  Return Code: {result.returncode}")
    print(f"  Stdout:\n{result.stdout}")
    if result.stderr:
        print(f"  Stderr:\n{result.stderr}")

    # Optional: Stop if a script fails (non-zero return code)
    # if result.returncode != 0:
    #     print(f"Error: {script} failed with code {result.returncode}. Stopping.")
    #     break

print("All sequential processes finished.")
```
- `subprocess.run(...)`: Executes the command (`python script.py`) and blocks until the script finishes.
- `capture_output=True, text=True`: Captures the script's standard output and standard error as strings.
- `check=False`: By default (`False`), `run` doesn't raise an exception if the script returns a non-zero exit code (indicating an error). Set `check=True` to automatically raise `CalledProcessError` on failure.
- The output will always show "Finished Script A." before "Executing Script B...", and so on.
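As a variant, here is a minimal sketch of the `check=True` style mentioned above, where a failing script raises `subprocess.CalledProcessError` and stops the loop:

```python
# main_sequential_check.py -- sketch of the check=True variant
import subprocess
import sys

for script in ['a.py', 'b.py', 'c.py']:
    try:
        # check=True raises CalledProcessError if the script exits non-zero
        subprocess.run([sys.executable, script], check=True)
    except subprocess.CalledProcessError as e:
        print(f"Error: {script} failed with code {e.returncode}. Stopping.")
        break
```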
Using the Command Line / Bash Script (No `&`)
Simply listing the commands one after another in the terminal or in a shell script makes them run sequentially.
- Command Line:

```bash
python a.py
python b.py
python c.py
```

- Bash Script (run it with `bash my_sequential_script.sh`):

```bash
#!/bin/bash
# my_sequential_script.sh
echo "Running scripts sequentially..."
python a.py
echo "---"
python b.py
echo "---"
python c.py
echo "Finished sequential script."
```
Using `os.system` in Python (Discouraged)
The `os.system()` function executes a command in a subshell. It waits for completion, making it sequential. However, `os.system` is generally discouraged in favor of the `subprocess` module because:
- It offers less control over the executed process (input/output redirection is harder).
- It can pose security risks if command strings include untrusted input.
- Error handling is less robust (mainly relies on checking the return code).
```python
# main_sequential_os_system.py (Use subprocess instead!)
import os

scripts = ['a.py', 'b.py', 'c.py']
python_command = "python"  # Or python3, py

print("Running sequentially with os.system (discouraged)...")
for script in scripts:
    print(f"--- Running {script} ---")
    command = f"{python_command} {script}"
    return_code = os.system(command)
    print(f"--- Finished {script} with code {return_code} ---")
```
Alternative: Importing Modules and Running Functions
Instead of running separate files as processes, you can structure your code within functions and import the modules. This runs all code within the same Python process.
First, modify `a.py`, `b.py`, and `c.py` to wrap their code in functions and prevent execution on import:
```python
# a.py (Modified)
import time

def run_a():
    print('Executing Function A...')
    time.sleep(1)
    print('Finished Function A.')

if __name__ == "__main__":  # Only run if the script is executed directly
    run_a()
```
Apply similar changes to `b.py` and `c.py`, creating `run_b()` and `run_c()`.
Sequential Execution via Imports
```python
# main_sequential_import.py
import a  # Import the modules
import b
import c

print("Running functions sequentially...")
a.run_a()
b.run_b()
c.run_c()
print("Finished sequential functions.")
```
This is simple and efficient if the tasks don't need process isolation.
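One advantage of staying in a single process is that the steps can pass data to each other directly. A hypothetical sketch, assuming `run_a()` and `run_b()` are further modified to return and accept values (the versions above do not):

```python
# main_pipeline.py -- hypothetical sketch; assumes run_a()/run_b() are
# modified to return and accept values (the versions above do not).
import a
import b

result_a = a.run_a()          # hypothetical: run_a() returns a result
result_b = b.run_b(result_a)  # hypothetical: run_b() accepts it as input
print(f"Pipeline result: {result_b}")
```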
Concurrent Execution via Imports and Threading
Use the `threading` module to run the imported functions concurrently within the main process.
```python
# main_concurrent_import_thread.py
import threading

import a
import b
import c

print("Running functions concurrently via threads...")
# Create thread objects
thread_a = threading.Thread(target=a.run_a)
thread_b = threading.Thread(target=b.run_b)
thread_c = threading.Thread(target=c.run_c)

# Start threads
thread_a.start()
thread_b.start()
thread_c.start()

print("Waiting for threads to finish...")
# Wait for threads to complete
thread_a.join()
thread_b.join()
thread_c.join()

print("Finished concurrent functions.")
```
This provides concurrency within one process, which is different from the multi-process approach of `subprocess`. It's suitable for I/O-bound tasks but may be limited by the Global Interpreter Lock (GIL) for CPU-bound tasks.
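For CPU-bound work, the standard-library `multiprocessing` module offers the same pattern with separate processes instead of threads, avoiding the GIL. A minimal sketch:

```python
# main_concurrent_import_mp.py -- sketch using multiprocessing
import multiprocessing

import a
import b
import c

if __name__ == "__main__":  # guard required on platforms that spawn processes (Windows, macOS)
    processes = [multiprocessing.Process(target=fn)
                 for fn in (a.run_a, b.run_b, c.run_c)]
    for p in processes:
        p.start()  # each function runs in its own process, so the GIL doesn't serialize them
    for p in processes:
        p.join()   # wait for all processes to finish
    print("Finished concurrent functions (multiprocessing).")
```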
Choosing the Right Method
- `subprocess.Popen`: Best for true concurrent process execution controlled from Python, especially if you need to manage the processes individually (check status, terminate, etc.).
- `subprocess.run`: Best for sequential process execution controlled from Python, waiting for each to finish. Simpler than `Popen` for this case.
- Command Line / Bash (`&` or sequential): Simple for basic automation outside of Python, but offers less control.
- Importing + Functions (Sequential): Use when process isolation isn't needed and you want simple sequential execution within one script.
- Importing + Threading: Use for concurrency within a single process; good for I/O-bound tasks where separate processes are overkill.
- `os.system`: Avoid in modern Python; use `subprocess` instead.
Conclusion
Python's `subprocess` module (`Popen` for concurrent, `run` for sequential) provides the most robust and recommended way to execute external Python scripts from within another Python program.

- Command-line methods using `&` (concurrent) or simple sequencing offer quick alternatives for shell scripting.
- Importing modules and running functions (possibly with `threading`) is suitable when true process isolation isn't required and you want to manage execution within a single Python instance.

Choose the method that best fits your needs for concurrency, control, and integration.