Calculate the time complexity of two functions

What is the difference between a time complexity of O(n) and O(log n)? Let us say I have a function which has a time complexity of O(log n) and a space complexity of O(n).

What will be the time complexity if I have to add two functions with time complexities of

  • log n and log n
  • log n and n
  • n and n

>Solution :

Take the highest "power" of n

  • log n and log n -> log n
  • log n and n -> n
  • n and n -> n
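A small Python sketch makes the rule concrete. The cost models below are illustrative, not real measurements: `ops_log` counts roughly log2(n) steps (like a binary search) and `ops_linear` counts n steps (like a single scan). Adding the counts of two steps run one after another shows which term dominates:

```python
import math

# Hypothetical cost models (illustrative, not real measurements):
def ops_log(n):
    # ~log2(n) steps, e.g. a binary search
    return int(math.log2(n))

def ops_linear(n):
    # n steps, e.g. a single scan over the input
    return n

n = 1_000_000
print(ops_log(n) + ops_log(n))        # ~2 log n  -> still O(log n)
print(ops_log(n) + ops_linear(n))     # n + log n -> O(n), n dominates
print(ops_linear(n) + ops_linear(n))  # ~2n       -> still O(n)
```

In the second sum, the log n term (about 20 for a million elements) is negligible next to the million linear steps, which is exactly why we keep only the highest "power" of n.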

Don’t bother with the constant term. We are only interested in scaling behaviours, i.e.

What happens to the overall space or time when we have n times more data?

If the space or time required goes up by the same factor, it is O(n).

If it goes up by a constant additive step for every constant scale factor of the input size, we say it is O(log n). For example, for sizes of 1, 10, 100, the times may be 4 sec, 4.5 sec, 5 sec. Then you are adding 0.5 seconds for every 10-fold increase in data.
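That additive-step behaviour can be modelled directly. The constants below (4.0 and 0.5) are made up to match the example figures, not measured from any real program:

```python
import math

# Made-up timing model for an O(log n) algorithm: t(n) = 4.0 + 0.5 * log10(n).
# The constants 4.0 and 0.5 are illustrative only.
def model_time(n):
    return 4.0 + 0.5 * math.log10(n)

for n in (1, 10, 100):
    print(n, model_time(n))  # 4.0, 4.5, 5.0 — +0.5 s per 10-fold increase
```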

If it goes up by the square of that factor, it is O(n^2).

It is not exactly like calculating logarithms

You raise an excellent question:

Will it not be log n * log n -> log^2(n) = log(n) × log(n) = log(n)^2?

I am looking at this answer quora.com/Difference-between-log-2-n-log-log-n-and-log-n-2 qr.ae/pr3wPd

That is the wrong arithmetic.

You were asking about a process that involves two subprocesses that have to happen one after another. So the time taken is the sum, not the product.

What you want to calculate is

log N + log N = 2 log N

But for O notation, we disregard the 2, and just write the result as O(log N).
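A concrete instance of this, sketched in Python: two binary searches run one after the other on the same sorted list. Each takes about log2(n) comparisons, so together they take about 2 log2(n), which is still O(log n):

```python
from bisect import bisect_left

# Two O(log n) steps, one after another: two binary searches on a sorted list.
# Total work is ~2 * log2(n) comparisons, which is still O(log n).
data = list(range(1_000_000))
i = bisect_left(data, 123)      # first O(log n) step
j = bisect_left(data, 456_789)  # second O(log n) step
print(i, j)  # 123 456789
```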

When you have one loop calling another, you multiply the things in the O()

As pointed out by @Support Ukraine in the comments:

If an algorithm which is O(log n) calls (in its innermost loop) another algorithm which is also O(log n), then the resulting complexity is O((log n) ^2).
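A minimal sketch of that multiplicative case: an outer loop that halves its counter (O(log n) iterations) whose body runs another halving loop. Counting the iterations shows the (log n)^2 total:

```python
# Sketch of an O((log n)^2) algorithm: an O(log n) loop whose innermost
# body runs another O(log n) loop. We count total inner iterations.
def nested_log_work(n):
    count = 0
    i = n
    while i > 1:        # ~log2(n) outer iterations
        j = n
        while j > 1:    # ~log2(n) inner iterations per outer step
            count += 1
            j //= 2
        i //= 2
    return count

n = 1 << 10                # n = 1024, log2(n) = 10
print(nested_log_work(n))  # 100 == (log2 n)^2
```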

When you have two additive steps (i.e. not multiplicative), you take the highest "power" of n

If your process involves doing an O(N) thing, and then once that has finished, doing an O(log N) thing, then the overall result is O(N).

This is because once N gets large, N grows much faster than log N.
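You can see that divergence numerically: the ratio n / log2(n) keeps growing without bound, so for large inputs the O(log N) part of the work becomes an ever-smaller fraction of the total:

```python
import math

# As n grows, n dwarfs log2(n), so O(n) + O(log n) collapses to O(n).
for n in (10, 1_000, 1_000_000):
    print(n, round(math.log2(n), 1), round(n / math.log2(n)))
```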

Quick summary: think of it as "proportional to"

When you say something is proportional to something else, you don’t really care about any constant terms.

Suppose the amount people eat is proportional to their weight. You can say that, without specifying how many meals, days, or years we are talking about. The duration will affect the constant of proportionality but not the concept that "cost of food eaten" is proportional to "body weight".

For example, if you switch between currencies, or add taxes, the constant of proportionality changes, but the proportionality remains.
