
# [Solved]: Running time of naive recursive implementation of unbounded knapsack problem

Problem Detail:

How does one go about analyzing the running time of a naive recursive solution to the unbounded knapsack problem? Lots of sources say the naive implementation is "exponential" without giving more detail.

For reference, here's a bit of Python code that implements the brute-force solution. Note that this can run for a long time even on smallish inputs. One of the interesting things about knapsack is that some inputs are a lot harder than others.

```python
import random
import time


class Item:
    def __init__(self, weight, value):
        self.weight = weight
        self.value = value

    def __repr__(self):
        return "Item(weight={}, value={})".format(self.weight, self.value)


def knapsack(capacity):
    if capacity == 0:
        return ([], 0)
    max_value = 0
    max_contents = []
    for item in items:
        if item.weight <= capacity:
            (contents, value) = knapsack(capacity - item.weight)
            if value + item.value > max_value:
                max_value = value + item.value
                max_contents = [item]
                max_contents.extend(contents)
    return (max_contents, max_value)


def generate_items(n, max_weight, vwratio=1):
    items = []
    weights = random.sample(range(1, max_weight + 1), n)
    for weight in weights:
        variation = weight / 10
        value = max(1, int(vwratio * weight + random.gauss(-variation, variation)))
        item = Item(weight, value)
        items.append(item)
    return items


n = 30
max_item_weight = 100
capacity = 100

items = generate_items(n=n, max_weight=max_item_weight, vwratio=1.1)

st = time.time()
solution, value = knapsack(capacity)
print("completed in %f" % (time.time() - st))
```

Note that this algorithm can be improved substantially by memoization, yielding an $O(nW)$ pseudo-polynomial-time solution, but I was interested in understanding how to analyze the brute-force algorithm more precisely.
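For concreteness, here is a minimal sketch of the memoized version the question alludes to (not the poster's code; it tracks only the best value, and represents items as `(weight, value)` pairs for simplicity). There are $O(W)$ distinct subproblems, each doing $O(n)$ work, hence $O(nW)$:

```python
from functools import lru_cache


def knapsack_memo(items, capacity):
    """Memoized unbounded knapsack: each residual capacity is solved once.

    `items` is a sequence of (weight, value) pairs; repeats are allowed.
    """
    @lru_cache(maxsize=None)
    def best(cap):
        best_value = 0
        for weight, value in items:
            if weight <= cap:
                # reuse of the same item is allowed: recurse on the
                # reduced capacity with the full item set
                best_value = max(best_value, value + best(cap - weight))
        return best_value

    return best(capacity)


# Example: items of weight 3 (value 5) and weight 4 (value 7), capacity 10
print(knapsack_memo(((3, 5), (4, 7)), 10))  # → 17 (two 3s and one 4)
```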

#### Answered By : gnasher729

You will be trying every sequence of items that fits, adding the same item multiple times if that is possible. So first you need to decide whether you actually want the same item to be usable multiple times. Either way, you can make the algorithm a lot more efficient by only adding items in ascending index order: if you just added item k, then you only consider adding items k, k+1, k+2, ... (if repeats are allowed) or k+1, k+2, ... (if they are not). This enumerates each combination of items once, instead of once per ordering.
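The ordering restriction can be sketched as follows (a hypothetical rewrite, not the poster's code; items are `(weight, value)` pairs and the extra `first` parameter carries the index bound):

```python
def knapsack_ordered(items, capacity, first=0):
    """Unbounded knapsack that only considers items at index >= first.

    Each multiset of items is enumerated exactly once, instead of once
    per permutation. Passing `i` below keeps repeats allowed; pass
    `i + 1` instead to forbid reusing the same item.
    """
    best = 0
    for i in range(first, len(items)):
        weight, value = items[i]
        if weight <= capacity:
            best = max(best, value + knapsack_ordered(items, capacity - weight, i))
    return best


print(knapsack_ordered([(3, 5), (4, 7)], 10))  # → 17
```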

You could also save substantial time by sorting the items by ascending weight. If you do that, you can try adding the smallest items first, and when an item doesn't fit, you know the remaining (heavier) items won't fit either, so you can stop the search there.
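One way to realize the early-exit idea: sort the items once by ascending weight, then break out of the loop as soon as an item no longer fits, since everything after it is at least as heavy. A sketch combining this with the index bound from the ordering trick (hypothetical code, not the poster's):

```python
def knapsack_sorted(items, capacity, first=0):
    """Assumes `items` is sorted by ascending weight.

    As soon as one item exceeds the remaining capacity, the loop stops:
    no later item can fit either.
    """
    best = 0
    for i in range(first, len(items)):
        weight, value = items[i]
        if weight > capacity:
            break  # all remaining items are at least this heavy
        best = max(best, value + knapsack_sorted(items, capacity - weight, i))
    return best


items = sorted([(4, 7), (3, 5)])  # sort once, by weight
print(knapsack_sorted(items, 10))  # → 17
```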

Anyway, the time depends entirely on the data. If every item's weight is between capacity/2 and capacity, the recursion depth is at most 2, so the algorithm runs in $O(n^2)$ time (and could be $O(n)$ with the improvements described earlier). If any item has weight 0, your algorithm never terminates; you should check for that first. In the worst case, all $n$ items have weight 1 and the capacity is $W$: your algorithm then takes roughly $n^W$ steps, which is especially bad because that particular instance is trivial.
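The $n^W$ figure for the worst case can be checked by counting calls: with all weights equal to 1, each call on capacity $W > 0$ makes $n$ recursive calls on capacity $W-1$, giving the recurrence $C(W) = 1 + n \cdot C(W-1)$, $C(0) = 1$, which solves to $C(W) = (n^{W+1} - 1)/(n - 1) = \Theta(n^W)$. A small script to confirm:

```python
def call_count(n, W):
    """Calls made by the naive recursion when all n items have weight 1
    and the capacity is W: C(W) = 1 + n*C(W-1), C(0) = 1.
    Closed form: (n**(W + 1) - 1) // (n - 1), i.e. Theta(n^W)."""
    return 1 if W == 0 else 1 + n * call_count(n, W - 1)


for W in range(1, 5):
    # grows by roughly a factor of n = 3 per unit of capacity
    print(W, call_count(3, W))
```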


Question Source : http://cs.stackexchange.com/questions/60452
