Time and Space Complexity

Time Complexity

The running time of an application depends on multiple factors:

  • Single vs Multiple processes

  • Read and write speed

  • 32-bit vs 64-bit architecture

  • Configuration of the machine

  • Input (How our program behaves with different inputs)

Time complexity measures how the time taken by a program grows as the number of inputs increases. We assume that all other factors remain constant, and we want to determine how fast the program runs for a given input and how that running time changes with the size of the input.

Hint: When asked about time complexity, answer from the perspective of the input

Hint: Time complexity is calculated based on the number of instructions executed

def get_sum(a, b):
    return a + b

# The function executes two instructions regardless of the input:
# 1. the addition  2. the return statement

# Hence the time complexity of get_sum is 2 units, that is O(2)
# O(2) is read as "order of 2"; since constants are dropped, this is
# constant time complexity, O(1)
# Another example to understand time complexity

# operations on a list to find the sum of all the elements
# assume arr has n elements

def find_sum(arr):
    total = 0 # first operation
    
    # two operations per loop check: assignment (to i) and the termination test
    # this check happens n+1 times (n iterations plus the final exit check)
    for i in arr:
        total += i
        # two operations per iteration: assignment (=) and addition (+)
        # executed n times
    return total # 1 operation
    
# verify that the number of operations = 1 + 2(n+1) + 2n + 1 = 4n + 4
# Hence O(4n+4); dropping constants and coefficients, this is O(n), i.e. the
# time complexity is a function of n (the number of elements in arr)

Time Complexity Notations

The order of time complexity is calculated through asymptotic analysis. The number of inputs is taken to be n, and we look at the upper- and lower-bound orders of the running time as n grows.

  • Big O notation:

    • The notation O(n) is the formal way to express the upper bound of an algorithm's running time

    • It measures the worst-case time complexity, i.e. the longest amount of time an algorithm can possibly take to complete

  • Omega notation:

    • The notation Ω(n) is the formal way to express the lower bound of an algorithm's running time

    • It measures the best-case time complexity, i.e. the minimum amount of time an algorithm can possibly take to complete

  • Theta notation:

    • The notation Θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm's running time (a tight bound); see the linear-search sketch below
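As a minimal sketch of these bounds (the function name is illustrative), consider linear search: if the target sits at the first position, the loop exits after one comparison (best case, Ω(1)); if the target is absent, all n elements are examined (worst case, O(n)); an unsuccessful search always performs exactly n comparisons, so that case is Θ(n).

def linear_search(arr, target):
    # scan the elements one by one until the target is found
    for index, value in enumerate(arr):
        if value == target:
            return index   # best case: target at position 0 -> Omega(1)
    return -1              # worst case: target absent -> all n elements checked, O(n)

# linear_search([4, 2, 7, 9], 7)  -> 2   (found after 3 comparisons)
# linear_search([4, 2, 7, 9], 5)  -> -1  (always n comparisons, Theta(n) for a miss)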

Time Complexity Rules

In general, we want to calculate time complexity in two cases:

  1. Very large input size (order of time complexity based on asymptotic input assumption)

  2. Worst-case scenario (when the inputs arrive in a particular order, for example the worst possible ordering of the input to a sorting algorithm)

A few rules for calculating time complexity:

  • Time complexity is given under the assumption that the number of inputs tends to infinity (asymptotic, n -> infinity); the term with the highest order of growth determines the order of the time complexity. For example, for 4n^4 + 10n^3 + 2n^2 + 1, the order is n^4.

  • Overall time complexity = sum of the time complexities of the individual code fragments (see the sketch after this list)

  • For conditional statements, the time complexity is taken from the worst-case branch
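A minimal sketch (the function name and workload are illustrative) showing how these rules combine: a single loop contributes O(n), a nested loop contributes O(n^2), and as n grows the n^2 term dominates the sum.

def demo_fragments(arr):
    n = len(arr)

    # Fragment 1: single loop -> O(n)
    total = 0
    for x in arr:
        total += x

    # Fragment 2: nested loop -> O(n^2)
    pair_count = 0
    for i in range(n):
        for j in range(n):
            pair_count += 1

    return total, pair_count

# Overall: O(n) + O(n^2) = O(n + n^2); as n -> infinity the n^2 term dominates,
# so the time complexity is O(n^2)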

Common asymptotic notations

  • constant: O(1)

  • logarithmic: O(log n)

  • linear: O(n)

  • n log n: O(n log n)

  • quadratic: O(n^2)

  • cubic: O(n^3)

  • polynomial: n^O(1)

  • exponential: 2^O(n)
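As an illustration of logarithmic growth (a minimal sketch, not tied to any earlier example), binary search on a sorted list halves the remaining search range on every step, so it needs at most about log2(n) iterations.

def binary_search(sorted_arr, target):
    low, high = 0, len(sorted_arr) - 1
    while low <= high:
        mid = (low + high) // 2        # each iteration halves the remaining range
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# For n = 1,000,000 elements, at most about 20 iterations are needed
# (2**20 > 1,000,000), so binary_search runs in O(log n) time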

Speeding-up API

  • In a micro-service based architecture, we generally break every application down into small, independent modules and create endpoints for each micro-service

  • When speeding up such applications, most of the time the problem does not lie in the loops or iterations

  • Optimising the iterations might only improve the response time by a fraction of a second (around 0.5 s or so)

  • The number of API calls and connections made within and between the micro-services is usually the bottleneck (see the sketch below)
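A minimal sketch of one common mitigation, assuming a hypothetical inventory-service endpoint: reusing a single requests.Session so that repeated calls share an underlying connection instead of opening a new one per call.

import requests

BASE_URL = "http://inventory-service.local/api"  # hypothetical endpoint

def fetch_items_naive(item_ids):
    # opens a new connection for every call -> connection setup dominates the time
    return [requests.get(f"{BASE_URL}/items/{item_id}").json() for item_id in item_ids]

def fetch_items_with_session(item_ids):
    # a single Session reuses the underlying connection across calls
    with requests.Session() as session:
        return [session.get(f"{BASE_URL}/items/{item_id}").json() for item_id in item_ids]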

Space Complexity

  • Space complexity analysis works much like time complexity analysis

  • Here, we want to see how efficient a piece of code is in terms of memory usage

  • Additional space/memory is measured in terms of the largest amount of memory the program uses while it runs. That is to say, if you allocate O(n) memory and later free it, that does not make the space complexity of your program O(1) (see the sketch below)
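A minimal sketch (function names are illustrative) contrasting the two cases: the first version allocates a list of n squares, so its peak memory use is O(n) even though the list is discarded afterwards; the second keeps only a running total and uses O(1) extra space.

def sum_of_squares_extra_space(n):
    squares = [i * i for i in range(n)]   # allocates a list of n elements -> O(n) space
    return sum(squares)                   # the peak memory use is what counts

def sum_of_squares_constant_space(n):
    total = 0
    for i in range(n):                    # only a counter and a running total -> O(1) space
        total += i * i
    return total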

Optimising RAM Space

  • In general, we don't need to interact with the garbage collector to manage memory

  • Most of the time, declaring a few extra objects does not noticeably impact RAM usage

  • However, creating multiple instances of a large package or model in different files within an application significantly increases RAM usage

  • Declaring one instance of the large module globally and reusing it wherever required within the application significantly lowers RAM usage (see the sketch below)
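A hypothetical sketch of the pattern: shared_resources.py and load_big_model are made-up names standing in for any expensive package initialisation. The heavy object is created once at module level, and every caller imports the same shared instance instead of creating its own.

# shared_resources.py (hypothetical module name)

def load_big_model():
    # stand-in for an expensive initialisation, e.g. loading a large ML model
    return {"weights": [0.0] * 10_000_000}

BIG_MODEL = load_big_model()   # created once, when the module is first imported


# elsewhere in the application, e.g. handlers.py:
#
# from shared_resources import BIG_MODEL
#
# def handle_request(data):
#     return BIG_MODEL["weights"][0] + len(data)   # reuse the shared instance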

Recursive Function: Time and Space Complexity
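As a minimal sketch of this analysis, consider a recursive factorial: each call does a constant amount of work and makes one recursive call, so there are about n calls in total (O(n) time); each call also keeps a frame on the call stack until the base case returns, so the space complexity is O(n) as well.

def factorial(n):
    # base case: one call, constant work
    if n <= 1:
        return 1
    # recursive case: constant work plus one recursive call
    return n * factorial(n - 1)

# factorial(5) -> 5 * 4 * 3 * 2 * 1 = 120
# Time:  about n calls, constant work each      -> O(n)
# Space: up to n stack frames alive at once     -> O(n)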

