Fundamentals of Analysis of Algorithms
Why Analyze Algorithms?
To understand:
- The cost: time & space
- The trade-offs between time & space
- Which algorithm is better?
Better in what sense? Faster?
We want comparisons that hold across hardware, implementations, & time
How to Analyze Algorithms?
- Count "steps", not clock time
- Asymptotic analysis: long-run performance, i.e., how does the algorithm scale?
- Worst-case analysis
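Counting steps rather than clock time can be made concrete with a small sketch (a hypothetical helper, not from the slides): a linear search instrumented to count comparisons. In the worst case (target absent), the count equals n, regardless of hardware.

```python
def linear_search_steps(arr, target):
    """Linear search that also counts comparisons ("steps")."""
    steps = 0
    for i, x in enumerate(arr):
        steps += 1              # one comparison per element examined
        if x == target:
            return i, steps
    return -1, steps            # worst case: target absent, steps == len(arr)
```

The step count is a property of the algorithm and the input size, which is exactly what asymptotic, worst-case analysis compares.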
Asymptotic Analysis
Big-O
\( T(n) = O( f(n) ) \)
if there exist positive constants \( c \) and \( n_0 \) such that \( T(n) \le c\,f(n) \) for all \( n \ge n_0 \)
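The definition can be checked numerically for a concrete case. Take \( T(n) = 10n + 100 \) and \( f(n) = n \); the witnesses \( c = 11 \) and \( n_0 = 100 \) are one valid choice (since \( 10n + 100 \le 11n \) exactly when \( n \ge 100 \)):

```python
# Sanity-check the Big-O definition for T(n) = 10n + 100 with f(n) = n.
# Claim: c = 11 and n0 = 100 witness T(n) = O(n).
def T(n):
    return 10 * n + 100

c, n0 = 11, 100
# The inequality T(n) <= c * f(n) holds for every sampled n >= n0.
assert all(T(n) <= c * n for n in range(n0, 100_000))
```

Any larger \( c \) or \( n_0 \) also works; Big-O only requires that *some* pair of constants exists.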
Algebra Rules for Big-O
If \( T_1(n) = O(f(n)) \) and \( T_2(n) = O(g(n)) \)
- \( T_1(n) + T_2(n) = O( f(n) + g(n) ) \)
- \( T_1(n) \cdot T_2(n) = O( f(n) \cdot g(n) ) \)
- If \( T(n) \) is a polynomial of degree \( k \), then \( T(n) = O(n^k) \)
Examples
- \( T(n) = 10n + 100 \)
then \( T(n) = O(n) \)
- \( T(n) = n^3 + 100n^2 + 38n + 10000 \)
then \( T(n) = O(n^3) \)
- \( T(n) = 2^n + 100n^{100} + 10000n \)
then \( T(n) = O(2^n) \)
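The last example can be verified with exact integer arithmetic: for large enough \( n \), the exponential term dominates, so \( T(n) \le 2 \cdot 2^n \) (the constants \( c = 2 \) and the sampled \( n \) values are illustrative choices, not the smallest possible):

```python
# Check that 2^n dominates: T(n) <= c * 2^n with c = 2 for large n.
# Python integers are arbitrary precision, so no overflow occurs.
def T(n):
    return 2**n + 100 * n**100 + 10000 * n

assert all(T(n) <= 2 * 2**n for n in (1100, 2000, 5000))
```

For small \( n \) (e.g., \( n = 100 \)) the polynomial term is actually larger, which is why the definition only requires the bound beyond some threshold \( n_0 \).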
Rules for Analyzing Code
- Basic operations are \( O(1) \)
- Add time for consecutive statements
- Multiply time for loops
- Take the max time for if-else statements
- For recursion, set up a recurrence and solve it (e.g., with the Master Theorem)
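The rules above can be applied line by line to a small nested-loop function (a hypothetical example for illustration):

```python
def count_pairs(a):
    """Count index pairs (i, j), i < j, with equal values."""
    n = len(a)                      # basic operation: O(1)
    total = 0                       # O(1); consecutive statements add
    for i in range(n):              # outer loop runs n times: multiply body cost
        for j in range(i + 1, n):   # inner loop runs up to n times
            if a[i] == a[j]:        # if-else: take the max branch cost, O(1)
                total += 1
    return total                    # total: O(1) + n * n * O(1) = O(n^2)
```

Adding the consecutive O(1) statements and multiplying the O(1) loop body by the two nested loops gives \( O(n^2) \) overall.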
Example