**Time and Space Complexity**

**Space Complexity**

Space complexity is defined as the process of determining a formula that predicts how much memory an algorithm will require for successful execution. In other words, the total space required by the algorithm for its working, as a function of the input size, is its space complexity.

For any algorithm, memory is required for the following purposes:

- To store program instructions.
- To store constant values.
- To store variable values.
- And a few other things, such as function calls, jump statements, etc.

**Auxiliary space**

Auxiliary space is the temporary or extra space an algorithm allocates while solving the problem, measured with respect to the input size.

Space complexity includes both the auxiliary space and the space used by the input.

For example:

```c
int fact = 1;                 // fact: 4 bytes
for (int i = 1; i <= n; ++i)  // n: 4 bytes, i: 4 bytes
{
    fact *= i;
}
return fact;                  // auxiliary space: 4 bytes
// total space: 16 bytes
```
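The example above uses a constant amount of memory regardless of `n`, so its auxiliary space is O(1). By contrast, an algorithm that allocates memory proportional to the input uses O(n) auxiliary space. A minimal sketch (the function name `reversed_copy` is illustrative, not from the original):

```c
#include <stdlib.h>

// Returns a newly allocated reversed copy of arr. The n-element buffer
// allocated here grows with the input, so auxiliary space is O(n).
int *reversed_copy(const int *arr, int n)
{
    int *out = malloc(n * sizeof *out); // n * 4 bytes of auxiliary space
    for (int i = 0; i < n; i++)
        out[i] = arr[n - 1 - i];
    return out;
}
```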

**Time Complexity**

Time complexity is defined as the process of determining a formula for the total time required to execute an algorithm. The calculation is independent of implementation details and programming language. In other words, it is the number of operations an algorithm performs to complete its task, with respect to the input size.

Time complexity is a computational way to describe how the runtime of a program grows as the size of its input increases.

```c
int count = 0;
for (int i = 0; i < N; i++)
    for (int j = 0; j < i; j++)
        count++;
```

The total number of times `count++` runs is 0 + 1 + 2 + … + (N−1) = N(N−1)/2, so the time complexity is O(N²).
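The closed form can be checked directly by running the loops and comparing the count against N(N−1)/2 (a quick sanity check, not part of the original example):

```c
// Counts how many times the innermost statement of the nested loops
// executes; this should equal N*(N-1)/2.
int nested_loop_count(int N)
{
    int count = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < i; j++)
            count++;
    return count;
}
```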

**Asymptotic notations**

They are used to make meaningful statements about the efficiency of algorithms. These notations let us make approximate but meaningful statements about time and space complexity.

**Big-Oh Notation (O)(Upper Bound)**

This notation is the formal way to express the upper bound (worst case) of an algorithm's running time. It measures the worst-case time complexity, i.e., the longest amount of time an algorithm can possibly take to complete.

Some common asymptotic notations:

- Constant time: O(1)
- Linear: O(n)
- Logarithmic: O(log n)
- Quadratic: O(n²)
- Cubic: O(n³)
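As an example of logarithmic time, binary search halves the remaining range on every iteration, so it runs in O(log n). A standard sketch on a sorted array:

```c
// Binary search on a sorted array: the range [lo, hi] halves on each
// iteration, so at most O(log n) iterations are needed.
int binary_search(const int *arr, int n, int target)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1; // not found
}
```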

Rules to find the Big-O notation:

- Find the fastest-growing term.
- Eliminate the constant terms and constant factors.
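To illustrate these rules with a hypothetical function: the snippet below performs roughly n² + 2n + 3 operations. The fastest-growing term is n², and after eliminating the constants the complexity is O(n²):

```c
// A function whose operation count is roughly n^2 + 2n + 3.
// The dominant (fastest-growing) term is n^2, so this is O(n^2).
long count_operations(int n)
{
    long ops = 3;                 // constant-time setup: 3 operations
    for (int i = 0; i < n; i++)
        ops += 2;                 // a loop doing 2 operations each pass: 2n total
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops += 1;             // nested loops: n^2 operations
    return ops;
}
```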

**Big-Omega Notation(Ω) (Lower bound)**

Big-Omega notation specifically describes the best-case scenario. It represents the lower bound on the running time of an algorithm; in other words, it tells you the fastest time in which the algorithm can run.
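Linear search is a common illustration: if the target happens to be the first element, the loop exits after a single comparison, so the best case is Ω(1), even though the worst case is O(n). A minimal sketch:

```c
// Linear search: best case (target at index 0) takes one comparison,
// so the running time is Omega(1); the worst case scans all n elements, O(n).
int linear_search(const int *arr, int n, int target)
{
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;
    return -1; // not found
}
```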

**Big-Theta Notation (Θ) (Tight Bound)**

Big-Theta notation gives a tight bound: it bounds the running time from both above and below. It is often described as the most realistic time complexity of an algorithm and is commonly used for the average-case scenario.
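A case where the bound is tight in every scenario: summing an array always visits all n elements, regardless of their values, so its running time is Θ(n) (the best, worst, and average cases all coincide):

```c
// Summing an array performs exactly n additions no matter what the
// contents are, so the running time is Theta(n) in every case.
long array_sum(const int *arr, int n)
{
    long total = 0;
    for (int i = 0; i < n; i++)
        total += arr[i];
    return total;
}
```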