- Execution Time
- Growth Rate
- Big O Notation
- Best, Worst, and Average Cases
- Ignoring Multiplicative Constants
- Ignoring Non-Dominating Terms
- Examples: Determining Big-O
- Constant Time
- Comparing Common Growth Functions
- Practical Considerations

# Execution Time

Suppose two algorithms perform the same task, such as searching (linear search vs. binary search). Which one is better?

## Growth Rate

It is difficult to compare algorithms by measuring their execution time, because running time depends on the computer, the compiler, and the specific input. Instead, a theoretical approach was developed that compares two algorithms by examining their *growth rates*: how fast an algorithm's execution time increases as the input size increases.

# Big O Notation

If an algorithm grows at a linear rate (such as the linear search), the growth rate has an order of magnitude of n. Using Big O notation, the complexity of the linear search algorithm is O(n), pronounced as “order of n.”
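As a concrete illustration (a minimal sketch; the class, array, and method names are made up for this example), a linear search examines the elements one by one, so in the worst case it performs n comparisons:

```java
// Minimal linear search sketch: returns the index of key in arr, or -1.
// In the worst case (key absent or in the last position), the loop body
// runs n times, so the running time grows linearly with n: O(n).
public class LinearSearchDemo {
    public static int linearSearch(int[] arr, int key) {
        for (int i = 0; i < arr.length; i++) {
            if (arr[i] == key) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {4, 7, 1, 9, 3};
        System.out.println(linearSearch(data, 9));  // found at index 3
        System.out.println(linearSearch(data, 8));  // not found: -1
    }
}
```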

## Best, Worst, and Average Cases

For the same input size, an algorithm’s execution time may vary, depending on the input. An input that results in the shortest execution time is called the best-case input, and an input that results in the longest execution time is called the worst-case input. Average-case analysis considers the expected time over all possible inputs of the same size, but it is hard to perform because the distribution of inputs is usually unknown.

Worst-case analysis is easier and gives a guaranteed upper bound, so, to compare the efficiency of two algorithms, the analysis is generally conducted for the worst case.
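For example (a sketch with a hypothetical helper method), counting the comparisons made by a linear search shows how the same input size can yield very different running times:

```java
// Sketch: count the comparisons a linear search makes, to contrast
// best-case and worst-case inputs of the same size.
public class CaseDemo {
    public static int comparisons(int[] arr, int key) {
        int count = 0;
        for (int value : arr) {
            count++;                 // one comparison per element examined
            if (value == key) break; // stop as soon as the key is found
        }
        return count;
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 8, 6, 1};
        System.out.println(comparisons(data, 5)); // best case: 1 comparison
        System.out.println(comparisons(data, 9)); // worst case: n = 5 comparisons
    }
}
```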

## Ignoring Multiplicative Constants

Algorithm analysis is focused on growth rate. Multiplicative constants have no impact on growth rates. The growth rate for n/2 or 100n is the same as for n, i.e., O(n) = O(n/2) = O(100n).

## Ignoring Non-Dominating Terms

Algorithm analysis is for large input size. As n grows larger, the n² part in the expression n² + n dominates the complexity. The Big O notation allows you to ignore the non-dominating part (e.g., n in the expression n² + n) and highlight the important part (e.g., n² in the expression n² + n). So, the complexity of this algorithm is O(n²).
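For instance (a sketch using hypothetical names), the following fragment performs exactly n² + n additions; the nested loop contributes the dominating n² term, so the whole fragment is O(n²):

```java
public class DominantTermDemo {
    // Performs n*n + n increments in total. The n*n term from the nested
    // loop dominates for large n, so the method is O(n^2).
    public static long operations(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {       // n iterations
            for (int j = 0; j < n; j++) {   // n iterations each -> n*n ops
                ops++;
            }
        }
        for (int i = 0; i < n; i++) {       // n more ops (non-dominating)
            ops++;
        }
        return ops;                          // n*n + n
    }

    public static void main(String[] args) {
        System.out.println(operations(10));  // 110 = 10*10 + 10
    }
}
```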

## Examples: Determining Big-O

**Simple Loops**

```
// Executed n times
for (i = 1; i <= n; i++) {
    k = k + 5; // constant time
}
```

Time Complexity: T(n) = cn = O(n)

**Nested Loops**

```
// outer loop executed n times
for (i = 1; i <= n; i++) {
    // inner loop executed n times
    for (j = 1; j <= n; j++) {
        k = k + i + j; // constant time
    }
}
```

Time Complexity: O(n²)

```
// executed n times
for (i = 1; i <= n; i++) {
    // inner loop executed 20 times
    for (j = 1; j <= 20; j++) {
        k = k + i + j; // constant time
    }
}
```

Time Complexity: O(20n) = O(n)

**Sequential Loops** (add the loops together)

```
// executed 10 times
for (j = 1; j <= 10; j++) {
    k = k + 4;
}
// executed n times
for (i = 1; i <= n; i++) {
    // inner loop executed 20 times
    for (j = 1; j <= 20; j++) {
        k = k + i + j;
    }
}
```

Time Complexity: O(10 + 20n) = O(n)

```
// check every item, O(n)
if (list.contains(e)) {
    System.out.println(e);
}
else {
    // n = list.size(). executed n times
    for (Object t : list) {
        System.out.println(t);
    }
}
```

Time Complexity: O(n) + O(n) = O(n)

## Constant Time

The Big O notation estimates the execution time of an algorithm in relation to the input size. If the time is not related to the input size, the algorithm is said to take *constant time* with the notation O(1).
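For example (a minimal sketch), retrieving an element of an array by its index takes the same number of steps no matter how large the array is:

```java
public class ConstantTimeDemo {
    // Accessing an array element by index is O(1): the time does not
    // depend on the length of the array.
    public static int first(int[] arr) {
        return arr[0];
    }

    public static void main(String[] args) {
        System.out.println(first(new int[]{42, 7, 13})); // 42, in O(1) time
    }
}
```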

## Comparing Common Growth Functions

- O(1): constant time
- O(log n): logarithmic time
- O(n): linear time
- O(n log n): log-linear time
- O(n²): quadratic time
- O(n³): cubic time
- O(2ⁿ): exponential time
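To see how differently these functions grow, the following sketch (class and method names invented for this illustration) computes several of them for increasing n:

```java
public class GrowthDemo {
    // Returns {log2 n, n, n*log2 n, n^2, n^3} for a power-of-two n,
    // to compare how fast common growth functions increase.
    public static long[] growth(int n) {
        long log = Math.round(Math.log(n) / Math.log(2));
        return new long[]{log, n, n * log, (long) n * n, (long) n * n * n};
    }

    public static void main(String[] args) {
        for (int n = 16; n <= 1024; n *= 8) {
            long[] g = growth(n);
            System.out.printf("n=%4d  log n=%2d  n log n=%6d  n^2=%8d  n^3=%12d%n",
                    n, g[0], g[2], g[3], g[4]);
        }
    }
}
```

Even at n = 1024, n³ is over a hundred million times larger than log n, which is why a lower-order algorithm wins for large inputs.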

# Practical Considerations

The big O notation provides a good theoretical estimate of algorithm efficiency. However, two algorithms with the same time complexity are not necessarily equally efficient: Big O ignores the multiplicative constants and non-dominating terms, which can matter in practice, especially for small or moderate input sizes.
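For instance (a sketch with hypothetical methods), both of the following sum an array in O(n) time, but the second performs several times as much arithmetic per element; Big O treats them as identical, yet the first is faster in practice:

```java
public class ConstantFactorDemo {
    // Both methods are O(n), but sumSlow does ~5x the arithmetic
    // per element. Big O hides this constant-factor difference.
    public static long sumFast(int[] arr) {
        long total = 0;
        for (int v : arr) total += v;
        return total;
    }

    public static long sumSlow(int[] arr) {
        long total = 0;
        for (int v : arr) {
            // artificial extra constant work per element
            total += v;
            total += v;
            total -= v;
            total += v;
            total -= v;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3};
        System.out.println(sumFast(data)); // 6
        System.out.println(sumSlow(data)); // 6, same result, more work
    }
}
```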