Fundamentals of Analysis of Algorithms

Why Analyze Algorithms?

To understand:
  • The cost: time & space
  • The trade-offs between time & space
  • Which algorithm is better?

Time?

Want a cost measure that lets us compare algorithms across time, hardware, & implementations

How to Analyze Algorithms?

  • Count "steps" not clock time
  • Asymptotic analysis: long-run performance, i.e., how does the algorithm scale?
  • Worst case analysis
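The "count steps, not clock time" idea can be sketched with a step-counting linear search (a hypothetical helper, not from the slides), where the worst case is a target that never appears:

```python
def linear_search(items, target):
    """Return (index of target or -1, number of comparisons made)."""
    steps = 0
    for i, x in enumerate(items):
        steps += 1  # count one "step" per comparison, independent of hardware
        if x == target:
            return i, steps
    return -1, steps

# Worst case: target is absent, so all n elements are compared -> n steps.
```

The step count depends only on the input size, which is exactly what lets us compare implementations on different machines.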

Asymptotic Analysis

Big-O

\( T(n) \approx O( f(n) ) \)
if there are positive constants \( c \) and \( n_0 \) such that \( T(n) \le c f(n) \) whenever \( n \gt n_0 \)
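For instance, \( T(n) = 10n + 100 \) meets the definition with \( c = 11 \) and \( n_0 = 100 \):

```latex
10n + 100 \le 10n + n = 11n \quad \text{whenever } n \ge 100,
\qquad \text{so } T(n) \approx O(n) \text{ with } c = 11,\; n_0 = 100.
```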

Asymptotic Analysis

Algebra Rules for Big-O

If \( T_1(n) \approx O(f(n)) \) and \( T_2(n) \approx O(g(n)) \)

  1. \( T_1(n) + T_2(n) \approx O( f(n) + g(n) ) \)
  2. \( T_1(n) * T_2(n) \approx O( f(n) * g(n) ) \)
  3. If \( T(n) \) is a polynomial of degree \(k\), then \(T(n) \approx O(n^k) \)

Examples

  1. \( T(n) = 10 n + 100 \)
    then \( T(n) \approx O(n) \)
  2. \( T(n) = n^3 + 100n^2 + 38n + 10000 \)
    then \( T(n) \approx O(n^3) \)
  3. \( T(n) = 2^n + 100 n^{100} + 10000n \)
    then \( T(n) \approx O(2^n) \)
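To see why only the leading term survives, a quick numeric sketch of example 2's polynomial (the helper name `T` is just for illustration):

```python
def T(n):
    """Example 2's running time: n^3 + 100n^2 + 38n + 10000."""
    return n**3 + 100 * n**2 + 38 * n + 10000

# The ratio T(n) / n^3 approaches 1 as n grows: the lower-order
# terms become negligible, so T(n) is O(n^3).
for n in (10, 100, 1000, 10000):
    print(n, T(n) / n**3)
```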

Rules for Analyzing Code

  1. Basic operations are \( O(1) \)
  2. Add time for consecutive statements
  3. Multiply time for loops
  4. Take the max time for if-else statements
  5. For recursion, use the Master Theorem
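Rules 1–4 can be applied line by line to a small nested-loop function (a hypothetical example, not from the slides):

```python
def count_pairs(items):
    """Count ordered pairs (i, j), i != j, with items[i] == items[j]."""
    n = len(items)
    count = 0                                 # rule 1: O(1) basic operation
    for i in range(n):                        # rule 3: outer loop runs n times
        for j in range(n):                    # rule 3: inner loop -> n * n iterations
            if i != j and items[i] == items[j]:   # rule 4: branch, O(1) either way
                count += 1                    # rule 1: O(1)
    return count                              # rule 2: add it all up -> O(1) + n*n*O(1) = O(n^2)
```

Multiplying the loop counts by the O(1) body and adding the constant-time statements gives \( O(n^2) \) overall.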

Example