Amdahl’s Law is a formula for estimating the potential speedup of a program when part of it is parallelised. It is widely used in computer science to evaluate the efficiency of parallel computing: it shows how much overall improvement you can expect from optimising a portion of a task.
## Formula of Amdahl’s Law
Amdahl’s Law states that the theoretical speedup $S$ of a program, given that a fraction $p$ of the program can be parallelised, is:
$
S = \frac{1}{(1 - p) + \frac{p}{n}}
$
where:
- $S$ = Speedup of the program when running on $n$ processors
- $p$ = Portion of the program that can be parallelised $(0 \leq p \leq 1)$
- $n$ = Number of processors
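The formula can be sketched as a small Python helper (the function name is my own, not part of any standard library):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup from Amdahl's Law.

    p: fraction of the program that can be parallelised (0 <= p <= 1)
    n: number of processors
    """
    # Serial part (1 - p) runs at full cost; parallel part p is split over n processors.
    return 1.0 / ((1.0 - p) + p / n)
```

With $p = 1$ this reduces to $S = n$ (perfect scaling), and with $p = 0$ it is always 1, matching the formula above.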
## Key Points
1. **Serial and Parallel Portions**:
- The term $1 - p$ represents the portion of the program that must be executed serially (cannot be parallelised).
- The term $\frac{p}{n}$ represents the parallel portion, spread across $n$ processors.
2. **Implications of Amdahl’s Law**:
- If a large part of the program is serial (i.e., $p$ is small), even increasing the number of processors won’t provide much speedup.
- As $n$ increases, the parallel term $\frac{p}{n}$ shrinks towards zero, so each additional processor contributes less to the overall speedup: diminishing returns.
3. **Practical Limits**:
- When $p$ is close to 1 (most of the program can be parallelised), adding processors will significantly speed up the program.
- If there’s a large serial portion (large $1 - p$), the speedup will be limited, no matter how many processors are added.
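The diminishing returns described above are easy to see numerically. This sketch (assuming a 90% parallelisable workload, a value chosen purely for illustration) prints the speedup as the processor count grows:

```python
def amdahl_speedup(p, n):
    # Amdahl's Law: serial part (1 - p) plus parallel part p split over n processors
    return 1.0 / ((1.0 - p) + p / n)

p = 0.9  # assume 90% of the work can be parallelised
for n in [1, 2, 4, 8, 16, 64, 1024]:
    print(f"n={n:5d}  speedup={amdahl_speedup(p, n):.2f}")

# The speedup approaches 1 / (1 - p) = 10 but never reaches it,
# no matter how many processors are added.
```

Going from 1 to 2 processors nearly doubles performance, while going from 64 to 1024 adds barely one extra unit of speedup.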
## Example
Suppose 80% ($p = 0.8$) of a task can be parallelised. If you use 4 processors ($n = 4$), the speedup $S$ will be:
$
S = \frac{1}{(1 - 0.8) + \frac{0.8}{4}} = \frac{1}{0.2 + 0.2} = \frac{1}{0.4} = 2.5
$
In this example, the speedup with 4 processors is 2.5. Moreover, because 20% of the task is serial and unaffected by parallelism, the speedup can never exceed $\frac{1}{1 - 0.8} = 5$, no matter how many processors are added.
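The arithmetic above can be checked step by step; the final print shows the limiting speedup $\frac{1}{1-p}$ that no processor count can exceed:

```python
# Verify the worked example: p = 0.8, n = 4
p, n = 0.8, 4

serial = 1 - p      # 0.2 of the work must run serially
parallel = p / n    # 0.8 of the work spread over 4 processors -> 0.2

speedup = 1 / (serial + parallel)
print(round(speedup, 6))      # 2.5, matching the example

# As n grows without bound, p / n -> 0 and only the serial part remains:
print(round(1 / (1 - p), 6))  # 5.0, the best speedup possible for p = 0.8
```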