Time & Space Complexity
Time Complexity
Time complexity describes how the running time of an algorithm grows as a function of its input size. It is commonly expressed in "big O" notation, which gives an upper bound on an algorithm's worst-case running time. The following table lists common time complexities and their "big O" notations:
| Time Complexity | Notation | Example |
| --- | --- | --- |
| Constant | O(1) | Accessing a specific element in an array |
| Logarithmic | O(log n) | Binary search |
| Linear | O(n) | Finding the maximum element in an array |
| Linearithmic | O(n log n) | Merge sort |
| Quadratic | O(n^2) | Bubble sort |
| Exponential | O(2^n) | Recursive Fibonacci |
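To make the difference between these classes concrete, here is a minimal sketch in Python contrasting an O(n) scan with an O(log n) binary search; the function names are illustrative, not from any particular library:

```python
def linear_max(arr):
    """O(n) time: every element must be examined once to find the maximum."""
    best = arr[0]
    for x in arr[1:]:
        if x > best:
            best = x
    return best

def binary_search(arr, target):
    """O(log n) time: the sorted search range is halved on each step."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # target not present

data = [2, 5, 7, 11, 13, 17]
print(linear_max(data))         # 17
print(binary_search(data, 11))  # 3 (index of 11)
```

On a list of a million sorted elements, the linear scan performs about a million comparisons while the binary search performs about twenty, which is why the distinction between O(n) and O(log n) matters in practice.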
Space Complexity
Space complexity describes how much memory an algorithm requires as a function of its input size. Like time complexity, it is commonly expressed in "big O" notation, which gives an upper bound on an algorithm's worst-case space usage. The following table lists common space complexities and their "big O" notations:
| Space Complexity | Notation | Example |
| --- | --- | --- |
| Constant | O(1) | Allocating a fixed number of variables |
| Linear | O(n) | Allocating an array of size n |
| Quadratic | O(n^2) | Allocating a 2D array of size n x n |
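The three rows above can be illustrated with a short Python sketch; each function's auxiliary space (memory beyond the input itself) matches one row of the table, and the function names are illustrative:

```python
def running_sum(arr):
    """O(1) extra space: a single accumulator, regardless of input size."""
    total = 0
    for x in arr:
        total += x
    return total

def make_copy(arr):
    """O(n) extra space: allocates a new list of the same size as the input."""
    return list(arr)

def multiplication_table(n):
    """O(n^2) extra space: builds an n x n grid of products."""
    return [[i * j for j in range(1, n + 1)] for i in range(1, n + 1)]

print(running_sum([1, 2, 3, 4]))   # 10
print(multiplication_table(3))     # [[1, 2, 3], [2, 4, 6], [3, 6, 9]]
```

Note that `running_sum` reads an O(n) input but only *allocates* O(1) memory; space complexity conventionally counts the extra memory an algorithm uses, not the input it is given.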
Conclusion
Understanding time and space complexity is essential for designing efficient algorithms. By analyzing an algorithm's time and space requirements, developers can choose the most appropriate algorithm for a given problem and optimize it for better performance. "Big O" notation provides a standardized way of expressing these costs and makes it possible to compare how algorithms scale as their input grows.