
How long will selection sort and merge sort take to sort a certain number of items?

Problem Detail: 

I am dealing with a sample exam question that I cannot understand which is as follows:

Selection sort takes one millisecond to sort 1000 items (worst-case time) on a particular computer. Estimate the amount of time it would take to sort 100,000 items. If Mergesort took 1 second for 100 items and 15 seconds for 1000 items, how long would you expect it to take for 1,000,000?

Now, I know that selection sort has a worst-case upper bound of $O(n^2)$, so as the input size $n$ increases, $f(n)$ grows quadratically. I understand the concept, but I don't see how to use it to work out how much time 100,000 items (100 times the original input) would take. How would I solve this?

Asked By : learnerX
Answered By : Raphael

They want you to apply the rule of three under ridiculous assumptions.

If an $O(n^2)$ algorithm takes 1 ms for 1000 elements, then from $1\,\text{ms} = c \cdot 1000^2$ and the ansatz $X\,\text{ms} = c \cdot 100000^2$ we get $X = 10000$.

In essence, they want you to assume that the running time function equals $cn^2$, in which case you could compute the constant $c$ from the given information.
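Under that (admittedly unrealistic) assumption, the arithmetic for both parts of the exercise can be sketched in a few lines. The model $c \cdot n \log n$ for Mergesort is my assumption for the second part; note that it reproduces the 15 s figure for 1000 items given in the question, which is presumably what the exam intends.

```python
import math

def scale_quadratic(t_known, n_known, n_target):
    """Estimate time for n_target from one data point, assuming t(n) = c * n^2."""
    c = t_known / n_known**2
    return c * n_target**2

def scale_nlogn(t_known, n_known, n_target):
    """Estimate time for n_target, assuming t(n) = c * n * log(n).

    The base of the logarithm cancels out, so math.log (natural log) is fine.
    """
    c = t_known / (n_known * math.log(n_known))
    return c * n_target * math.log(n_target)

# Selection sort: 1 ms for 1000 items -> estimate for 100,000 items.
print(scale_quadratic(1, 1000, 100_000))   # ~10000 ms, i.e. 10 s

# Mergesort: 1 s for 100 items; sanity check against the 15 s for 1000 items,
# then estimate for 1,000,000 items.
print(scale_nlogn(1, 100, 1_000))          # ~15 s, matches the given data point
print(scale_nlogn(1, 100, 1_000_000))      # ~30000 s
```

So the forced answers are about 10 seconds for selection sort and about 30,000 seconds for Mergesort.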

The problem is that

  • $O(\_)$ is only an upper bound and tells you nothing about the true behaviour of the function it describes,
  • there are unknown lower-order terms hidden in that $O(\_)$ (and also in $\Theta(n^2)$, if you had that),
  • asymptotics tell us nothing for any finite $n$,
  • the exercise ignores that the algorithms take different amounts of time on different inputs of the same size, and
  • "time" is a famously fickle cost measure, since it depends heavily on effects like caching and process scheduling.

Therefore, the problem is senseless.
