
Introduction


Also see algorithms.

We can learn some of the core concepts of Big-O rather quickly. If we took a deep dive into Big-O, we could study it for a year.

Key Big-O symbols:

N - Number of data elements
O - Number of operations


N


Increasing
N = size of data (number of elements)

For Big-O, we want to think of "N" (or "n") as increasing towards infinity.


O


Definition


O = # of operations - The number of operations required for an algorithm to complete for a given N.

For Big-O, we determine how much O is increasing (as N is increasing).


Why Operations?


Why do we consider operations? Why not consider time? One reason is that run times vary because of external factors we cannot control, such as processor speed, available CPU, and available memory. Another reason is that algorithms are typically hardware independent, whereas measuring time would tie the analysis to specific hardware.

Operations are predictable.

Big-O is Not About Numbers


With Big-O, we are not going to estimate that operations increase by 1.5%, or 25.39%. Rather, our outcome will be Big-O notation (symbols), as we'll soon see.

Assumptions & Limitations


Big-O Themes:
  • We analyze how the number of operations (O) grows as N grows, not the absolute operation count.

Assumptions:
  • N is very large (we think of N as increasing towards infinity).
  • Worst-case behavior (see "Couldn't We Match on First?" below).

Limitations:
  • Constant factors and small fixed costs are ignored, so Big-O does not predict actual run times.


O(1)


O vs N Example
If O (number of operations) is constant as N increases, we call it O(1).

O(1) indicates good algorithm performance (relative to increasing n).

Why not use O(3) here? Because the "1" in "O(1)" is just an indicator meaning "constant".
N                    O
1                    3
1,000                3
1,000,000,000,000    3


O(n)


O vs N Example
If O (number of operations) grows linearly as N increases, we call it O(n).

O(n) has much poorer performance than O(1) relative to increasing N.

The "n" in "O(n)" is an indicator meaning "linear". The example shows a linear rate of "O = 1N" but any linear rate (2N, 3N, etc) is classified as O(n).
N      O
100    100
200    200
300    300


Example #1 - size


The DynamicArray's "size" method simply returns the "size" ivar (standard getter method).

Algorithm
We simply return the size ivar (one operation).
Code
Here is Java code for this algorithm
public int size() {
	return this.size;
}


O vs N
As N increases towards infinity, the number of operations stays constant (just one operation, "get ivar"). This is "O(1)".


Example #2 - removeFirst


The DynamicArray's "removeFirst" method removes the first element and returns it.

Algorithm
//"array" is convenience var
array = this.elements
//Get first
elem = get(0)
//ShiftAllLeft
Loop i from 1 to (this.size - 1)
	array[i-1] = array[i]
//Set last to null
array[this.size - 1] = null
Return elem;
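Code (sketch)
Here is a possible Java sketch of this algorithm (assuming the same "elements" array, "size" ivar, and helper methods used in the "first" example below; the empty check and the size decrement are added assumptions, not part of the pseudocode above):
public E removeFirst() {
	if (isEmpty())
		throwRuntime(emptyAccessMsg());	//assumed guard, as in "first"
	//"array" is convenience var
	E[] array = this.elements;
	//Get first
	E elem = array[0];
	//ShiftAllLeft
	for (int i = 1; i < this.size; i++)
		array[i - 1] = array[i];
	//Set last to null
	array[this.size - 1] = null;
	//Assumed: keep the size ivar in sync
	this.size--;
	return elem;
}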



O vs N
We see that the number of operations is increasing linearly as N increases towards infinity. This is "O(n)".




Don't Sweat the Small Stuff
We can focus our Big-O analysis on just the loop part of the algorithm. Why?

Remember N gets very large.
  • Let N = 1,000,000,000.
  • Then O = 1,000,000,000 from the loop (one shift operation per element).
  • We have four other ops (e.g. "get(0)", etc.)
  • Total = 1,000,000,004
  • The four ops are not significant (they can be ignored).
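To see this numerically, here is a small throwaway Java sketch (the "n + 4" count is a rough model, not part of the DynamicArray: one operation per loop iteration plus the four fixed ops):
public class OpCount {
	//Rough model of removeFirst: N shift operations plus 4 fixed operations
	static long opsForRemoveFirst(long n) {
		return n + 4;
	}

	public static void main(String[] args) {
		for (long n : new long[] {1_000L, 1_000_000L, 1_000_000_000L}) {
			System.out.println("N = " + n + "   O = " + opsForRemoveFirst(n));
		}
	}
}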
//"array" is convenience var
array = this.elements
//Get first
elem = get(0)
//ShiftAllLeft
Loop from 2 to (this.size - 1)
	array[i-1] = array[i]
//Set last to null
array[this.size - 1] = null
Return elem;

Example #3 - first


The DynamicArray's "first" method simply returns its first element.

Algorithm
If (isEmpty())
	Throw runtime exception
Return this.elements[0];

Code
Here is Java code for this algorithm
public E first() {
	if (isEmpty())
		throwRuntime(emptyAccessMsg());
	return this.elements[0];
}


O vs N
As N increases towards infinity, the number of operations stays constant (just one operation, "get"). This is "O(1)".



Should that be O(2)?
Don't we have two or three operations (O) here rather than one?

Yes, we do have a couple of operations, but remember we are not interested in the actual O count, but rather in the increase in O vs N. And we know that O is not increasing.

Also, remember N is large. Let's say 1,000,000,000,000. Would one or two operations be significant vs that N?
What if [0] is slow?
What if array access is slow?

Two points on this question:

  • Big-O looks at operation increase (O) vs N. And O is constant (1 operation) for this algorithm (vs increasing N).
  • However, we surely want to keep our eyes open even if it's not in Big-O's domain. So such interest in "[]" is good. Read more about fixed array access here. It is fast.

Example #4 - linearSearch


The DynamicArray's "linearSearch" method scans (iterates in order) the data to find the first match and returns the match index (or -1 if not found).

Algorithm
//Given object "z" to search for
Loop (scan) through the elements in order
	If an element matches "z"
		Return index
//None found
Return -1
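Code (sketch)
Here is a possible Java sketch of this algorithm (assuming the same "elements" array and "size" ivar as the earlier examples, and equals-based matching of non-null elements):
public int linearSearch(E z) {
	//Loop (scan) through the elements in order
	for (int i = 0; i < this.size; i++) {
		//If an element matches "z", return its index
		if (this.elements[i].equals(z))
			return i;
	}
	//None found
	return -1;
}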



O vs N
We see that the number of operations is increasing linearly as N increases towards infinity. This is "O(n)".




Couldn't We Match on First?
Big-O assumes "worst-case" (Murphy's law / pessimistic).

Which means we match on the last element.

See "Assumptions & Limitations"
