
Asymptotic or "Big O" Notation for Understanding Algorithm Performance

What does it mean for an algorithm to be "fast" or "efficient"? When we say an algorithm is fast, we are not referring to an explicit measurement of time, like seconds or minutes. There is a distinction here between hardware and software: the real-time performance of a program can be drastically affected by the hardware it runs on. To account for this while still providing an objective measure of a program's efficiency, computer scientists rely on asymptotic measurement and a notation format called Big O notation to describe such measurements.

The formal definition is as follows: f(x) ∈ O(g(x)) if there exist constants C and x₀ such that f(x) ≤ C • g(x) for all x > x₀.

In less theoretical terms: as the size of your input grows toward infinity, how does the runtime of your program grow? For example, imagine counting the number of characters in a string in the simplest possible way, by reading each character and summing a running total.
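Here is a minimal sketch of that idea in Python (the function name and example string are only placeholders for illustration). Because the loop visits every character exactly once, the number of steps grows in direct proportion to the length of the input, which is exactly what O(n) captures:

def count_characters(s):
    # Visit each character once and keep a running total.
    count = 0
    for _ in s:
        count += 1
    return count

print(count_characters("algorithm"))  # prints 9: one step per character

Doubling the length of the string doubles the number of loop iterations, so the runtime grows linearly with the input size regardless of how fast the underlying hardware happens to be.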