
Quantities of Information: Entropy II


2. Joint Entropy: The joint entropy of two discrete random variables X and Y is simply the entropy of their pairing (X, Y). It follows that if X and Y are independent, their joint entropy is the sum of their individual entropies: H(X, Y) = H(X) + H(Y).
For example, if (X, Y) represents the position of a chess piece, with X the row and Y the column, then the joint entropy of the row and the column of the piece is just the entropy of the position of the piece.
 Despite similar notation, joint entropy should not be confused with cross entropy.
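
To make the chess example concrete, here is a minimal Python sketch (the helper functions entropy and joint_entropy are my own illustrative names, not from any standard library) that treats the 64 equally likely squares of an 8x8 board as the joint distribution of row and column. It shows that the joint entropy of row and column equals the entropy of the position, and, since row and column are independent here, also equals H(X) + H(Y).

import math
from collections import Counter

def entropy(values):
    """Entropy H(X) in bits, estimated from a list of samples."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def joint_entropy(pairs):
    """Joint entropy H(X, Y) in bits: the pairing (X, Y) is itself a random variable."""
    return entropy(pairs)

# A chess piece placed uniformly at random on an 8x8 board:
# X is the row, Y is the column, (X, Y) is the position.
positions = [(r, c) for r in range(8) for c in range(8)]  # each square equally likely
rows = [r for r, _ in positions]
cols = [c for _, c in positions]

print(entropy(rows))             # 3.0 bits (8 equally likely rows)
print(entropy(cols))             # 3.0 bits (8 equally likely columns)
print(joint_entropy(positions))  # 6.0 bits = 3.0 + 3.0, since row and column are independent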


3. Conditional entropy: The conditional entropy or conditional uncertainty of X given the random variable Y (also called the equivocation of X about Y) is the average, over the values of Y, of the entropy of X given each value:

   H(X|Y) = - Σ_{x,y} p(x, y) log p(x|y)

Because entropy can be conditioned either on a random variable or on the event that the random variable takes a particular value, care should be taken not to confuse these two definitions of conditional entropy; the former is the one in more common use. A basic property of this form of conditional entropy is that:
   H(X|Y) = H(X, Y) - H(Y)
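
As a quick check on this identity, the following Python sketch (the joint distribution p_xy is a made-up example, not from the post) computes H(X|Y) directly from its definition and compares it with H(X, Y) - H(Y):

import math

def H(dist):
    """Entropy in bits of a distribution given as a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A made-up joint distribution p(x, y) in which X and Y are correlated.
p_xy = {('a', 0): 0.4, ('a', 1): 0.1, ('b', 0): 0.1, ('b', 1): 0.4}

# Marginal of Y: p(y) = sum over x of p(x, y)
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# Conditional entropy computed directly from its definition:
# H(X|Y) = -sum over (x, y) of p(x, y) * log2 p(x|y), where p(x|y) = p(x, y) / p(y)
h_x_given_y = -sum(p * math.log2(p / p_y[y]) for (x, y), p in p_xy.items())

print(h_x_given_y)       # ~0.7219 bits
print(H(p_xy) - H(p_y))  # the same value: H(X|Y) = H(X, Y) - H(Y)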

