What is time complexity of halving a matrix while doing an O(n^2) operation on it?

Suppose you have two square matrices, m1 and m2, of size n × n. The algorithm iteratively runs an O(n^2) operation on them while halving the matrices at each iteration. Something like the pseudo-code below:

while m1.width > 0:
    u = m1 & m2  # pairwise logical and operation (O(n^2))
    m1 = halve(m1)
    m2 = halve(m2)
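A runnable sketch of that loop, assuming halve() simply discards half the rows and half the columns. Instead of real matrices, it tracks only the current width and counts the element-wise operations, which is all the complexity analysis needs (the function name total_and_ops is illustrative, not from the original):

```python
def total_and_ops(n):
    """Count element-wise operations done by the halving loop, starting at n x n."""
    width = n
    ops = 0
    while width > 0:
        ops += width * width  # the pairwise AND touches every cell: O(width^2)
        width //= 2           # halve(m1) and halve(m2) shrink each dimension by 2
    return ops

print(total_and_ops(1024))  # noticeably less than 1024**2 * 11, the n^2 log n bound
```

Running this for a few values of n already suggests the total stays within a constant factor of n^2 rather than growing like n^2 log n.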

Since the loop is repeated log(n) times, O(n^2 log n) is an upper bound for the algorithm’s time complexity. I am looking for a tighter upper bound. Any ideas?

I expect a tighter upper bound based on the premise that the size of the matrices on which the O(n^2) operation runs shrinks geometrically at each iteration.


> Solution:

With a loop like this, it’s often a good idea to add up the work done in each iteration rather than to multiply the work done by one iteration by the number of iterations. After all, the work done per iteration is changing.

Leaving out the O notation for a moment, your first iteration does roughly n^2 work, both to compute the AND and to halve the matrix size. Now, how much work does the second iteration do? Well, since your matrices are half as big as before in each dimension, the work done to AND them is roughly (n / 2)^2 = n^2 / 4. When you halve the matrices again, the next AND does (n / 4)^2 = n^2 / 16 work. More generally, the kth time you go through the loop, your AND operation will do n^2 / 4^k work.

Adding this up across all iterations of the loop gives us that the total work done is

n^2 / 4^0 + n^2 / 4^1 + n^2 / 4^2 + …

= n^2 · (1 / 4^0 + 1/4^1 + 1/4^2 + …)

That latter bit is the sum of a geometric series, and it adds up to 4/3. So that gives us that the total work done is at most 4n^2 / 3, which is O(n^2).
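A quick numeric check of that geometric series, using exact rational arithmetic so no floating-point rounding is involved: the partial sums of 1/4^0 + 1/4^1 + … climb toward 4/3 and never exceed it.

```python
from fractions import Fraction

# Partial sums of the geometric series 1/4^0 + 1/4^1 + ... for a few lengths.
partials = [sum(Fraction(1, 4 ** k) for k in range(m)) for m in (1, 2, 4, 8)]
print([float(p) for p in partials])  # 1.0, 1.25, 1.328125, ... climbing toward 4/3
```

Multiplying any of those partial sums by n^2 bounds the total work, so the constant 4/3 in the bound above is tight in the limit.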

Another way to see this: while your code is written iteratively, you could imagine thinking of it as a recursive algorithm that does O(n^2) work, then runs again on an input of size n / 2. That gives the recurrence

T(n) = T(n / 2) + O(n^2).

The Master Theorem (with a = 1, b = 2, d = 2, and log_b a = 0 < d) then tells you that the solution is O(n^2).
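That recurrence can also be checked empirically. Here is a minimal sketch, assuming the base case T(0) = 0 and measuring work in units of one element-wise operation; the ratio T(n) / n^2 settles near the 4/3 constant from the series argument:

```python
def T(n):
    """Unrolled recurrence T(n) = T(n // 2) + n^2, with T(0) = 0 assumed."""
    if n == 0:
        return 0
    return T(n // 2) + n * n

print([T(n) / (n * n) for n in (16, 256, 4096)])  # ratios approach 4/3
```

Seeing the ratio stabilize rather than grow with log n is exactly what distinguishes the O(n^2) bound from the looser O(n^2 log n) one.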
