
[Solved]: Conceptual question about entropy and information

Problem Detail:

Shannon's entropy measures the information content by means of probability. Is it the information content or the information that increases or decreases with entropy? Increase in entropy means that we are more uncertain about what will happen next.

1. What I would like to know is if entropy increases, does this mean that information increases?

2. Suppose there are two signals: a desired signal and a measured signal. Let the error be the difference between the two (or, in the context of weight learning, the estimation error).

What can we infer if the entropy of this error term decreases? Can we conclude that the error is reducing and the system is behaving close to the desired signal's behavior?

I shall be grateful for these clarifications.

Answered By : Wandering Logic

Information = Entropy = Surprise = Uncertainty = How Much You Learn By Making an Observation. They all increase or decrease together.

The entropy of a random variable $X$ is just another number summarizing some quality of that random variable. Just as the mean is the expected value of $X$, and the variance is the expected value of $(X - \mu)^2$, the entropy is the expected value of some function $f(X)$ of the random variable $X$. You find expectations of functions by using $\mathbb{E}[f(X)] = \sum_{x\in X}p(x)f(x)$. In this case the function of $X$ you care about is the negative log (base 2) of the probability mass function:

$$H(X) = \mathbb{E}[-\log_2 P(X)] = -\sum_{x \in X} p(x) \log_2 p(x).$$

This particular expectation is useful because it doesn't depend on the actual values that $X$ can take on, just the probabilities of those values. So you can use it to talk about situations where you aren't sending numbers, or where the numbers are just arbitrarily assigned to particular messages or symbols that you need to send.
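To make the formula above concrete, here is a minimal sketch (the function name `entropy` is my own choice, not from the answer) that computes $H(X)$ directly from a list of probabilities, ignoring whatever values $X$ actually takes on:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum p(x) * log2 p(x).

    Terms with p = 0 are skipped, following the convention 0*log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

entropy([0.5, 0.5])    # 1 bit: a fair coin is maximally uncertain
entropy([0.9, 0.1])    # about 0.469 bits: a biased coin surprises less
entropy([1.0])         # 0 bits: a certain outcome carries no information
```

Note that only the probabilities appear, which is exactly the point made above: the same entropy results whether the symbols are numbers, letters, or arbitrary messages.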

What you want for your second question is the conditional entropy of the measurement random variable $X$ given the random variable $Y$ that represents what was sent. When there is no error the conditional entropy is 0; when there is error the conditional entropy is greater than 0.
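As a sketch of that last point (my own illustration, under the assumption that the joint distribution of $(X, Y)$ is known), conditional entropy can be computed as $H(X \mid Y) = -\sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(y)}$:

```python
import math

def cond_entropy(joint):
    """H(X|Y) in bits, from a joint pmf given as {(x, y): p}."""
    # Marginal distribution of Y
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    # H(X|Y) = -sum p(x,y) * log2( p(x,y) / p(y) )
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / py[y])
    return h

# No error: the measurement X always equals the sent symbol Y,
# so observing Y leaves no uncertainty about X and H(X|Y) = 0.
noiseless = {(0, 0): 0.5, (1, 1): 0.5}

# With error: X differs from Y with probability 0.1,
# so H(X|Y) > 0 (about 0.469 bits here).
noisy = {(0, 0): 0.45, (1, 0): 0.05, (0, 1): 0.05, (1, 1): 0.45}
```

As the error shrinks, the noisy joint distribution approaches the noiseless one and $H(X \mid Y)$ falls toward 0, which is the sense in which a decreasing error entropy indicates the system is tracking the desired signal.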


Question Source : http://cs.stackexchange.com/questions/28756