
# Pick parameter function that minimises whole function

Problem Detail:

I have a recursive algorithm defined by the following recursion.

$$T(n) = T(n/f(n)) + O(\log f(n)).$$

I want to find the function $f$ that minimizes $T(n)$. If $f$ is a constant then $T(n) = \Theta(\log n)$, and the same holds if $f(n) = \Theta(n)$. Is it true that $T(n) = \Omega(\log n)$ for every $f$, or can I get asymptotically better behaviour for some $f$? The reason I suspect better behaviour might be possible is an extended binary search: split the domain into $f(n)$ sections and examine each section to see whether the target value lies in it. That search satisfies the recurrence

$$T(n) = T(n/f(n)) + O(f(n)).$$

This extended search clearly runs in at least logarithmic time for every choice of $f$. It matches my recursion except that its cost term is linear in $f(n)$ rather than logarithmic, so one might hope the logarithmic variant behaves better.
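As a sanity check (my addition, not part of the question), one can unroll the first recurrence numerically, treating the $O(\log f(n))$ cost as exactly $\log_2 f(n)$. With that choice of constant, the cost telescopes: each step adds $\log_2 f(n_i) = \log_2(n_i/n_{i+1})$, so the total is $\log_2 n$ minus the log of the final value, for *every* $f$. A minimal sketch, where `total_cost` is a hypothetical helper:

```python
import math

def total_cost(n, f):
    """Unroll T(n) = T(n / f(n)) + log2(f(n)) numerically,
    stopping once the argument drops to a small constant."""
    cost = 0.0
    while n > 2:
        step = max(2.0, f(n))  # keep the divisor >= 2 so we always make progress
        cost += math.log2(step)
        n /= step
    return cost

n = 2 ** 40  # so log2(n) = 40
for name, f in [("constant 2",  lambda n: 2),
                ("log n",       lambda n: math.log2(n)),
                ("sqrt n",      lambda n: math.sqrt(n)),
                ("n (one step)", lambda n: n)]:
    print(f"{name:12s} total cost ~ {total_cost(n, f):.1f}")
```

Every choice of $f$ prints a total close to $40 = \log_2 n$, which is consistent with the suspicion that $T(n) = \Omega(\log n)$ regardless of $f$; the open question is whether the constants hidden in the $O$ can change this.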

###### Answered By : Yuval Filmus

This is probably not possible. Let $n_1,\ldots,n_k$ be the sequence of $n$s encountered in the recurrence, starting with $n_1 = n$. Using the concavity of $\log$ (and eliminating the big $O$), we get that $$T(n) \geq k \log \frac{f(n_1) + \cdots + f(n_k)}{k}.$$ Assuming $f$ is monotone, we have $k \geq \log n_1/\log (n_1/n_2) = \log n/\log f(n)$. This monotonicity also shows that $n_{(k+1)/2} \geq \sqrt{n}$. Therefore $$T(n) \geq \log n \frac{\log f(\sqrt{n}) - 1}{\log f(n)}.$$ For all $f(n)$ I can think of, $\log f(\sqrt{n})/\log f(n) = \Theta(1)$, so they can't give you anything better than $\log n$. This also says that if you're looking for a monotone $f$ which beats $\log n$, then it must satisfy $\log f(\sqrt{n}) = o(\log f(n))$.