Big O notation is a mathematical notation that describes the upper bound, or worst-case scenario, of an algorithm's runtime complexity in terms of the input size.
In simple terms, Big O notation helps us answer the question, "How does code slow as data grows?"
In Big O notation, the "O" refers to the order of growth of an algorithm's runtime; it is followed by a function that represents the upper bound of the algorithm's time complexity.
// Approach 1: sum 0..n with a loop — the work grows with n
const addNumbersLoop = (n) => {
  let result = 0;
  for (let i = 0; i <= n; i++) {
    result += i;
  }
  return result;
};

// Approach 2: closed-form formula n(n + 1) / 2 — a single calculation
const addNumbersFormula = (n) => {
  return n * (n + 1) / 2;
};

console.log(addNumbersLoop(10000));    // 50005000
console.log(addNumbersFormula(10000)); // 50005000
In the example above, Approach 1's execution time increases as we increase the number, but Approach 2's stays the same no matter how large the input gets. Hence, Approach 1 is linear, so its Big O is O(n), while Approach 2 is constant in terms of execution time, so its Big O is O(1).