Time Complexity
Time complexity measures how the running time of an algorithm grows as the input size grows. It's like asking: if I double the input, how much longer will the algorithm take? We use Big O notation to describe this growth in the worst case.
Common Time Complexities (from best to worst):
O(1) – Constant time. Fastest.
O(log n) – Logarithmic time. Very fast (binary search).
O(n) – Linear time. Scales directly with input (simple loop).
O(n log n) – Linearithmic time (efficient sorts like merge sort).
O(n²) – Quadratic time (nested loops, bubble sort).
O(2^n) – Exponential time (recursive Fibonacci without memoization).
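As a sketch of why binary search is O(log n): every comparison halves the remaining range, so a sorted array of n elements needs at most about log₂ n comparisons. A minimal version (the `binarySearch` helper here is illustrative, not from the original text):

```java
public class BinarySearchDemo {
    // Returns the index of target in the sorted array, or -1 if absent.
    // Each iteration halves the search range: O(log n) comparisons.
    static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
            if (arr[mid] == target) return mid;
            if (arr[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 5, 8, 12, 16, 23, 38};
        System.out.println(binarySearch(sorted, 23)); // index 5
        System.out.println(binarySearch(sorted, 7));  // -1 (not present)
    }
}
```

Note that binary search only works on sorted input; on an unsorted array you would fall back to an O(n) linear scan.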
Examples:
// O(1): accessing an element by index takes the same time regardless of n
int first = arr[0];

// O(n): the loop body runs n times
for (int i = 0; i < n; i++) { ... }

// O(n²): the inner loop runs n times for each of the outer loop's n iterations
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) { ... }
}
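To round out the examples with the exponential case from the list above: recursive Fibonacci without memoization spawns two subcalls per call, so the call tree has roughly 2^n nodes. A minimal sketch (the `fib` function is illustrative):

```java
public class FibDemo {
    // Naive recursion: fib(n) calls fib(n-1) and fib(n-2), so the
    // number of calls roughly doubles at each level -- O(2^n) time.
    static long fib(int n) {
        if (n < 2) return n;
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // 55
    }
}
```

Even fib(50) is impractical with this version; caching previously computed values (memoization) brings it down to O(n).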
Big O ignores constants and lower-order terms: O(2n) = O(n), O(n² + n) = O(n²).
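The simplification rule can be seen by counting steps directly: a linear loop followed by a nested pair does n + n² units of work, and the n² term dominates as n grows, so the whole thing is O(n²). A small sketch (the `countSteps` helper is illustrative):

```java
public class SimplifyDemo {
    // One linear pass (n steps) plus a nested pair (n * n steps):
    // total work is n + n^2, which simplifies to O(n^2).
    static long countSteps(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) steps++;       // O(n) part
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) steps++;   // O(n^2) part
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(countSteps(10));  // 110  (10 + 100)
        System.out.println(countSteps(100)); // 10100 (100 + 10000)
    }
}
```

At n = 100 the quadratic part already accounts for about 99% of the steps, which is why the lower-order n term is dropped.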
Two Minute Drill
- Time complexity = how runtime grows with input size.
- Big O notation describes worst-case growth.
- Common complexities: O(1), O(log n), O(n), O(n log n), O(n²), O(2^n).
- Drop constants and lower-order terms when simplifying.
- Understanding complexity helps you write efficient code.
Need more clarification?
Drop us an email at career@quipoinfotech.com
