
[Solved]: Proving that f(N) = N lg N + O(N) implies f(N) = Θ(N log N)

Problem Detail: 

How do you prove the following?

$$f(N) = N \lg N + O(N) \implies f(N) = \Theta(N \log N)$$

Here $\lg N \equiv \log_2 N$, $O$ stands for big-O, and $\Theta$ stands for big-Theta. Also, $\log N \equiv \log_e N$, i.e. the natural logarithm.

My reasoning may not be all that mathematically pleasing (or even correct)...but here goes...

We have $f(N) = N \lg N + O(N)$.

By definition, $O(f(N))$ stands for some upper bound on the function $f(N)$.

Therefore, in our relation $O(N)$ (being an upper bound function) should theoretically contain $f(N)$ as $N \rightarrow \infty$.

Now consider the lower bound of $f(N)$ to be some function of $N \space lg \space N$, i.e. $\Omega(f(N)) \equiv f(N \space lg \space N)$

Now, $\lg N \equiv \log_2 N > \log N$, where $\log$ stands for the natural logarithm. Hence the natural logarithm is some constant factor of the "binary" logarithm, where that constant is less than 1.

So all the above reasoning naturally implies our above assertion, right? Where am I wrong?

Asked By : deostroll

Answered By : Luke Mathieson

Your third line (of the proof) doesn't seem to make sense; are you trying to say $f(N) \in O(N)$? If so, that's not true (you're trying to prove that it's not true!). Lines 4 and 5 are not clear either, and if you were to write that in an exam or paper, you'd get in trouble.

As usual with asymptotic analysis, we just need to go back to the definitions. $f(N) \in \Theta(N\log N)$ means there exist constants $n_{0}$, $c_{1}$ and $c_{2}$ such that $c_{1} N\log N \leq f(N) \leq c_{2} N\log N$ for all $N \geq n_{0}$.

There's a method involving limits that works, but I'll do it the old-fashioned way first.

Lower bound ($f(N) \in \Omega(N\log N)$):

As $f(N) = N\log N + O(N)$ (and because we're dealing with CS, we make mathematically unjustified assumptions like everything being non-negative), we have $f(N) \geq N \log N$, so we can pick $c_{1} = 1$ and any $n_{0} \geq 0$, and we have $f(N) \in \Omega(N\log N)$.

Upper bound ($f(N) \in O(N\log N)$):

By the definition of $O(\cdot)$, we know that $f(N) \leq N \log N + b\times N$ for all $N \geq m_{0}$, for some constants $b$ and $m_{0}$. We can assume that we're using base-2 logarithms and that $N \geq 2$, so $\log N \geq 1$. Therefore $f(N) \leq N\log N + b\times N \leq N\log N + b\times N \log N = (b+1)N\log N$.

Thus we can pick $c_{2} = b+1$ and any $n_{0} \geq \max\{m_{0},2\}$, and we get $f(N) \in O(N\log N)$.

Putting it together: So we have $f(N) \in O(N\log N)$ and $f(N) \in \Omega(N\log N)$, which gives $f(N) \in \Theta(N\log N)$ where $c_{1} = 1$, $c_{2} = b+1$ and $n_{0} = \max\{m_{0},2,0\}$ (obviously the $0$ is extraneous).
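As a sanity check, the sandwich above can be verified numerically. The sketch below assumes a hypothetical $f(N) = N\lg N + 3N$ (i.e. the hidden $O(N)$ term is taken to be $3N$, so $b = 3$ and $m_0 = 0$; this is an illustration, not given in the problem) and checks the derived constants $c_1 = 1$ and $c_2 = b+1 = 4$:

```python
import math

# Hypothetical example: f(N) = N lg N + 3N, i.e. the O(N) term is 3N (b = 3).
def f(N):
    return N * math.log2(N) + 3 * N

b = 3
c1, c2 = 1, b + 1      # constants from the proof: c1 = 1, c2 = b + 1
n0 = 2                 # max(m0, 2) with m0 = 0 for this example

# Check c1 * N lg N <= f(N) <= c2 * N lg N for a range of N >= n0.
for N in [2, 10, 100, 10_000, 1_000_000]:
    lower = c1 * N * math.log2(N)
    upper = c2 * N * math.log2(N)
    assert lower <= f(N) <= upper, f"bound violated at N={N}"
```

The loop passing for growing $N$ is of course not a proof, but it is a quick way to catch a mistake in the derived constants.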

The Limit Approach:

We can also take the limit of the ratio of two functions to determine some asymptotic relationships. In particular for us, if $\lim_{n\rightarrow \infty}\frac{f(n)}{g(n)} = k$ for some constant $k > 0$, then $f(n) \in \Theta(g(n))$.

So for our two functions, we take $\lim_{N\rightarrow \infty}\frac{f(N)}{N\log N}$, apply l'Hôpital's rule a couple of times, and conclude that $f(N) \in \Theta(N\log N)$.
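Again purely as an illustration (the concrete $f(N) = N\log N + 3N$ below is an assumed example, not part of the problem), we can watch the ratio $f(N)/(N\log N) = 1 + 3/\log N$ drift toward the positive constant $1$:

```python
import math

# Hypothetical example: f(N) = N ln N + 3N.
def f(N):
    return N * math.log(N) + 3 * N

# Compute f(N) / (N ln N) = 1 + 3/ln N for N = 10, 100, ..., 10^8.
ratios = [f(10**k) / (10**k * math.log(10**k)) for k in range(1, 9)]

# The ratios decrease monotonically toward 1 as N grows.
print(ratios)
```

Since the limit is $1 > 0$, the criterion above gives $f(N) \in \Theta(N\log N)$, matching the direct proof.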

Best Answer from StackOverflow
