
[Answers] sorting a sorted array that has been inc/dec by random numbers

Problem Detail:

You are given a sorted array of n distinct positive integers. Each element is then incremented or decremented by some number between 0 and X, where X is a positive integer that is a function of n.

Formulate an algorithm that takes the perturbed array and sorts it. The running time should be O(n) when X is constant.

I am unsure about my solution. I just chose an arbitrary number of buckets, 10, and the proof seemed to work out. Is there a way to prove that using X (the variable from above) buckets would be better?

In addition, is there a way to exploit the fact that the array was sorted before being perturbed?

My solution:

```
1. Find the minimum and maximum of the array by iterating through it.   O(n)
2. Create the intervals for the k buckets (k = number of buckets):
       k = 10
       interval = (Max - Min + 1) / k
3. Assign each element to a bucket based on the interval.               O(n)
4. Sort each bucket; if the i-th bucket has n_i elements, sorting it
   with randomized quicksort takes S(n_i) = O(n_i log n_i).
5. Concatenate all the buckets.                                         O(n)

Runtime analysis:
- Assuming the elements are uniformly distributed, each bucket
  holds about n/k elements.
```

If instead k = n/10, each bucket holds about 10 elements, so T(n) = O(n log 10) = O(n).
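The steps above can be sketched as follows (a minimal Python sketch; the built-in list sort stands in for the randomized quicksort of step 4, and the bucket count `k` is a parameter):

```python
def bucket_sort(arr, k=10):
    """Bucket sort with k buckets, following the five steps above."""
    # Step 1: find the minimum and maximum in O(n).
    lo, hi = min(arr), max(arr)
    # Step 2: compute the bucket interval width.
    interval = (hi - lo + 1) / k
    # Step 3: assign each element to a bucket in O(n).
    buckets = [[] for _ in range(k)]
    for x in arr:
        buckets[int((x - lo) / interval)].append(x)
    # Step 4: sort each bucket (list.sort stands in for quicksort).
    for b in buckets:
        b.sort()
    # Step 5: concatenate the buckets in order, O(n).
    return [x for b in buckets for x in b]
```

Note that `(hi - lo + 1)` in step 2 guarantees every computed bucket index is strictly less than `k`, so no element falls outside the last bucket.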

Your algorithm doesn't work: you're analyzing it under a (suspect) random model, whereas the problem setters want a worst-case guarantee. In your worst case everything falls into the same bucket, and then you only get the usual \$O(n\log n)\$ bound.

Instead of your approach, I suggest trying to use the following observation:

After sorting, each value moves at most \$2X\$ positions: the original integers are distinct, so two elements that are \$d\$ positions apart differ by at least \$d\$, and perturbing each by at most \$X\$ can reverse their order only when \$d \le 2X\$.

As an example of why this is useful, note that you can find the minimum of the array in time \$O(X)\$ by looking only at the first \$2X\$ elements.
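One standard way to turn this observation into the required algorithm (a sketch, not necessarily the intended solution): plain insertion sort runs in \$O(nD)\$ time when every element is at most \$D\$ positions from its sorted position, so with \$D = 2X\$ it sorts the perturbed array in \$O(nX)\$, which is \$O(n)\$ for constant \$X\$:

```python
def sort_perturbed(arr):
    """Insertion sort. On an array where each element is at most 2X
    positions from its sorted position, the inner loop shifts at most
    2X elements per iteration, giving O(n * X) total time."""
    a = list(arr)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right; at most 2X shifts happen here.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

An alternative with the same flavor is to sweep a min-heap of size \$O(X)\$ over the array, which gives \$O(n \log X)\$, also \$O(n)\$ for constant \$X\$.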