
# How to interpret these asymptotic runtime bounds for discrete logarithm algorithms?

Problem Detail:

I am trying to compare asymptotic runtime bounds of a few algorithms presented in this research paper, A quasi-polynomial algorithm for discrete logarithm in finite fields of small characteristic. The functions are,

$L(\alpha) = \exp(O(\log(n)^{\alpha}(\log\log(n))^{1-\alpha}))$,

which I'd like to plot for $\alpha = 1/3, 1/4+O(1)$, and $1/4 + O(n)$,

and $n^{O(\log n)}$,

where $n$ is the bit-size of the input. ... Such a complexity is smaller than $L(\epsilon)$ for any $\epsilon > 0$.

I would like to plot each of these functions together to compare their growth. The problem is that the second function grows much faster than the first, which implies I am misinterpreting something.

So what is the correct way to interpret and compare these functions?

###### Answered By : Yuval Filmus

The confusion is that in the expression for $L(\alpha)$, $n$ is the size of the field, whereas in $n^{O(\log n)}$, $n$ is the size of the input. The correct comparison is between $L(\alpha) = \exp O((\log n)^\alpha (\log\log n)^{1-\alpha})$ and $(\log n)^{O(\log\log n)} = \exp O((\log\log n)^2)$, which is indeed much smaller.
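To see this numerically, one can compare the *exponents* of the two running times with the same meaning of $n$ (the field size $N$). The sketch below sets every hidden big-O constant to $1$, which is an assumption; with real constants the crossover point would shift, which is exactly why plots of such bounds are of limited value:

```python
import math

def log_L(alpha, logN):
    # Exponent of L(alpha): (log N)^alpha * (log log N)^(1 - alpha),
    # with the hidden big-O constant assumed to be 1.
    return logN ** alpha * math.log(logN) ** (1 - alpha)

def log_quasi(logN):
    # Exponent of the quasi-polynomial bound (log N)^{O(log log N)}:
    # (log log N)^2, again with the hidden constant assumed to be 1.
    return math.log(logN) ** 2

# Compare the exponents for fields of bit-size 64 up to 2^40.
for bits in (64, 1024, 2 ** 20, 2 ** 40):
    logN = bits * math.log(2)  # natural log of the field size N = 2^bits
    print(f"{bits:>14} bits: L(1/3) exponent = {log_L(1/3, logN):10.1f}, "
          f"quasi-poly exponent = {log_quasi(logN):10.1f}")
```

With these (assumed) constants, the quasi-polynomial exponent is actually *larger* for small fields and only drops below the $L(1/3)$ exponent for very large ones; this illustrates why asymptotic superiority says nothing about concrete input sizes without knowing the constants.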

Note that your suggestion to plot $L(\alpha)$ for $\alpha = 1/4 + O(1)$ and $\alpha = 1/4 + O(n)$ doesn't make sense; you should think of $\alpha$ as a constant. It could make sense to consider $\alpha = 1/4 + o(1)$ (or $\alpha = 1/4 + \epsilon$, where $\epsilon$ is understood to be "small"), though it's not clear exactly how you'd plot it. Indeed, there is no reason to plot: without knowing the hidden big-O constants, a plot wouldn't mean much. In practice you just run experiments.