
## QUESTION

**Does CPU time reflect Big O time complexity?**

### WHAT I MADE

I implemented two Fibonacci functions in C: one recursive and one iterative.

I worked out that the recursive version takes exponential time in the worst case (bounded by $O(2^n)$), while the iterative version takes linear time, since it is just a single simple loop.

I then measured the CPU time for both versions using the `time.h` C library.

### WHAT I EXPECT

I expect those CPU times to be related, at least roughly, to the time complexities. In particular, I expect much larger CPU times for the recursive version.

### WHAT I GOT

I plotted the measured CPU times in MATLAB (a log-log plot of CPU time as a function of $n$).

The CPU time for the recursive function seems to grow exponentially, which is what I expected. However, I'm not sure how to interpret the CPU time for the iterative version, since its curve does not look like the straight lines usually described for log-log graphs.

I'd like to be able to answer this myself with confidence, but I'm quite a newbie.

###### Asked By : Mattia Paterna

#### Answered By : D.W.

There are really two questions here:

1. How does CPU time relate to time complexity?
2. Can you use a plot of CPU time to understand time complexity?

Let me answer each in turn.

## Time complexity

Time complexity refers to the *total time* it takes for the algorithm to finish.

If by "CPU time" you just mean the time it took for this particular program to run, excluding the time for other programs or the operating system, then yes, these are the same.

If by "CPU time" you meant the time spent on computation but not time spent on I/O, then it might or might not match what is meant by time complexity. In practice, I/O is often slower than computation -- maybe by a significant factor. Normally, in theoretical work where we measure time complexity, we don't try to account for that distinction, as it's "only a constant factor".

## The plot heuristic

Plotting the running time as a function of the size of the input can definitely be a useful *heuristic* to form an *informed guess* about what the asymptotic time complexity of an algorithm is. It's often a good guide. However, be warned that it is not 100% reliable: there are some cases where it can be misleading.
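A standard way to read such a log-log plot (sketched here in the question's notation, with $c$ and $k$ as generic constants):

```latex
% Polynomial running time: a straight line on log-log axes.
T(n) = c\,n^k
  \;\Rightarrow\; \log T(n) = k \log n + \log c
  \quad (\text{slope } k \text{ against } \log n)

% Exponential running time: linear in n, not in \log n,
% so on log-log axes the curve bends upward ever more steeply.
T(n) = c\,2^n
  \;\Rightarrow\; \log T(n) = n \log 2 + \log c
```

So the iterative version ($k = 1$) should look approximately like a straight line of slope 1 once $n$ is large enough for the asymptotic term to dominate; for small $n$, constant overheads such as timer resolution can distort the shape.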

I'd like to refer you to the following three questions for more on this topic:

## Bottom line

In summary: yes, basically, plotting CPU time as a function of input size can be a helpful way to assess asymptotic time complexity, as long as you keep in mind the caveats.


Question Source : http://cs.stackexchange.com/questions/66815
