**Problem Detail:**

I am having difficulty understanding the notion of density for a distribution.

**Notion of density for a distribution**. A distribution $H$ over $\{0,1\}^n$ has density $\sigma$ if for every $x \in \{0,1\}^{n}$, $\Pr[H=x] \leq \frac{1}{\sigma 2^n}$.
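To make the definition concrete, here is a small sketch (my own, not from the question) that checks the density condition by brute force over all $2^n$ strings; the dictionary representation and the name `has_density` are illustrative assumptions.

```python
from itertools import product

def has_density(dist, n, sigma):
    """Return True iff Pr[H = x] <= 1/(sigma * 2**n) for every x in {0,1}^n.

    `dist` maps n-bit strings to probabilities; missing keys mean probability 0.
    """
    bound = 1.0 / (sigma * 2 ** n)
    return all(dist.get("".join(bits), 0.0) <= bound
               for bits in product("01", repeat=n))

# The uniform distribution on {0,1}^2 has density 1 but not density 2:
uniform = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}
print(has_density(uniform, 2, 1.0))  # True
print(has_density(uniform, 2, 2.0))  # False
```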

The following are my desperate attempts to understand the notion. If $H$ is the uniform distribution over $\{0,1\}^n$, then for every $x \in \{0,1\}^n$, $\Pr[H=x]=\frac{1}{2^n}$.

If a distribution $H$ has density $\sigma = 1$, then $\Pr[H=x]\leq\frac{1}{2^n}$, so $H$ is pointwise upper bounded by the uniform distribution.

If a distribution $H$ has density $\sigma = \frac{1}{2}$, then $\Pr[H=x]\leq\frac{1}{2^{n-1}}$, so $H$ is upper bounded by the uniform distribution over $\{0,1\}^{n-1}$.

If a distribution $H$ has density $\sigma = \frac{1}{2^n}$, then $\Pr[H=x]\leq 1$, which is no constraint at all.

So does the density $\sigma$ determine what fraction of $\{0,1\}^n$ the actual "probabilistic weight" must be spread over? In any case each probability is upper bounded, so we can always say something about the upper bound?
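The intuition above can be checked numerically: with $\sigma = \frac{1}{2}$ the bound allows all of the probability mass to sit on half of the strings. A small sketch, using an example distribution of my own construction:

```python
from math import isclose

# With sigma = 1/2 and n = 3 the bound is 1/(sigma * 2**n) = 1/4,
# so all of the mass may sit on just 2**(n-1) = 4 of the 8 strings.
n, sigma = 3, 0.5
bound = 1 / (sigma * 2 ** n)                               # = 0.25
half_support = {format(i, "03b"): 0.25 for i in range(4)}  # "000".."011"
assert isclose(sum(half_support.values()), 1.0)
assert all(p <= bound for p in half_support.values())
print(len(half_support))  # 4 strings carry all the weight
```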

As you see I don't have intuition behind this notion and I would appreciate any help.

###### Asked By : com

#### Answered By : frafl

This notion seems to be a measure of uniformity. If $\sigma=1$ you get the uniform distribution on $\{0,1\}^n$ (since the sum over all strings of length $n$ is $1$), but for other values of $\sigma$ (e.g. $\sigma = 2^{-k}$) it is still a distribution on $\{0,1\}^n$, not on $\{0,1\}^{n-k}$ or similar. Thus saying that a distribution on another (disjoint) set is an upper bound is absolutely wrong (but the two are related, see below).

If for some $x$ the probability is greater than $\frac{1}{2^n}$, it has to be smaller for others. On the other hand, for $\sigma=\frac{1}{2}$ at least half of the strings have nonzero probability, since the sum of fewer than $2^{n-1}$ probabilities, each at most $\frac{1}{2^{n-1}}$, is smaller than $1$. This holds for any $\sigma > 0$: at least $\sigma 2^n$ elements have nonzero probability.
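The counting argument in this paragraph can be sketched as follows (the concrete numbers are chosen only for illustration):

```python
# If every probability is at most 1/(sigma * 2**n), then k support points
# carry total mass at most k/(sigma * 2**n), which is < 1 whenever
# k < sigma * 2**n.  Hence the support has at least sigma * 2**n elements.
n, sigma = 4, 0.25
bound = 1 / (sigma * 2 ** n)   # = 0.25
min_support = sigma * 2 ** n   # = 4.0
assert 3 * bound < 1           # 3 points cannot carry all the mass
assert 4 * bound == 1          # 4 points can, just barely
print(min_support)  # 4.0
```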

Using this you can *compare* the distribution with $\sigma=2^{-k}$ on $\{0,1\}^n$ to the distribution with $\sigma=1$ (i.e. uniform) on $\{0,1\}^{n-k}$. It's easy to see that the entropy of the distribution with $\sigma=2^{-k}$ on $\{0,1\}^n$ (let $H$ be a random variable with this distribution) is at least the entropy of the uniform distribution on $\{0,1\}^{n-k}$ (random variable denoted by $U$), since for every injection $f: \{0,1\}^{n-k}\rightarrow \{0,1\}^n:$ $$\mathbb{P}(U=x) = \frac{1}{2^{n-k}} \geq \mathbb{P}(H=f(x)) \quad .$$
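A quick numerical check of the entropy claim, using a made-up distribution with density $2^{-k}$ (the distribution `H` below is a hypothetical example of mine, not from the answer):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

n, k = 4, 2
# A made-up distribution on {0,1}^4 with density 2**-k = 1/4:
# every probability is at most 1/(2**-k * 2**n) = 1/2**(n-k) = 0.25.
H = [0.25, 0.25, 0.2, 0.1, 0.1, 0.05, 0.05] + [0.0] * 9
assert max(H) <= 0.25 and abs(sum(H) - 1) < 1e-9
U = [1 / 2 ** (n - k)] * 2 ** (n - k)   # uniform on {0,1}^{n-k}
print(shannon_entropy(H) >= shannon_entropy(U))  # True
```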


Question Source : http://cs.stackexchange.com/questions/11354

