Mutual Information and Channel Capacity

1. Channel Capacity and Mutual Information

Channel capacity C is the maximum rate at which information can be reliably transmitted over a channel. From information theory:

C = max_Q I(X; Y)

where:
- I(X; Y) is the mutual information between the channel input X and the output Y,
- Q is the input distribution over which the maximum is taken,
- W is the channel transition probability that, together with Q, determines I(X; Y).

Mutual Information Expressed via Entropy (Equation 9)

The mutual information between input and output can be expressed as:

I(X; Y) = h(Y) - h(Y | X)    (9)

where:
- h(Y) is the (differential) entropy of the output,
- h(Y | X) is the conditional entropy of the output given the input.

2. Conditional Entropy for the Channel (Equation 10)

For an additive-noise channel Y = X + Z with noise Z independent of the input, conditioning on X leaves the noise as the only remaining uncertainty in Y. Assuming Gaussian noise with variance N, the conditional entropy of the output given the input is

h(Y | X) = h(Z) = (1/2) log(2πe N)    (10)
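To make the decomposition concrete, here is a minimal Python sketch that evaluates Equation 9 for a Gaussian-input additive-Gaussian-noise channel and checks it against the closed-form capacity C = (1/2) log2(1 + P/N). The variable names P and N and the example power values are illustrative assumptions, not part of the original text:

```python
import numpy as np

def gaussian_entropy(var):
    """Differential entropy of a Gaussian with variance var, in bits:
    h = (1/2) * log2(2*pi*e*var)."""
    return 0.5 * np.log2(2 * np.pi * np.e * var)

P = 1.0   # input (signal) power, assumed example value
N = 0.25  # noise variance, assumed example value

# For X ~ N(0, P) and independent noise Z ~ N(0, N), the output
# Y = X + Z is Gaussian with variance P + N.
h_Y = gaussian_entropy(P + N)

# Equation 10: h(Y | X) = h(Z), since given X the only remaining
# uncertainty in Y is the noise Z.
h_Y_given_X = gaussian_entropy(N)

# Equation 9: I(X; Y) = h(Y) - h(Y | X).
I_XY = h_Y - h_Y_given_X

# Closed-form AWGN capacity for comparison.
C = 0.5 * np.log2(1 + P / N)

print(f"h(Y) - h(Y|X)    = {I_XY:.4f} bits/channel use")
print(f"0.5*log2(1+P/N)  = {C:.4f} bits/channel use")
```

The two printed values agree because a Gaussian input maximizes h(Y) under a power constraint, which is exactly the step that turns the mutual-information decomposition of Equation 9 into the capacity formula.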