Mutual Information and Channel Capacity Explanation
1. Channel Capacity and Mutual Information
Channel capacity C is the maximum rate at which information can be reliably transmitted over a channel.
From information theory, capacity is the mutual information maximized over all admissible input distributions, so any particular choice of input distribution yields a lower bound:
C = max_Q I(Q; W) ≥ I(X; Y)
where:
- I(X; Y) is the mutual information between channel input X and output Y.
- Q is the input distribution.
- W is the channel transition probability.
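As a concrete check of this lower-bound property, the sketch below computes I(X; Y) for a discrete memoryless channel directly from Q and W. For a binary symmetric channel the uniform input attains capacity, while other choices of Q fall short. The crossover probability p = 0.1 and the candidate input distributions are illustrative values, not taken from the source.

```python
import numpy as np

def mutual_information(Q, W):
    """I(Q; W) in nats for a discrete memoryless channel.

    Q : shape (nx,) input distribution.
    W : shape (nx, ny) transition matrix, W[x, y] = P(Y = y | X = x).
    """
    PY = Q @ W                                  # output distribution P(Y = y)
    I = 0.0
    for x in range(len(Q)):
        for y in range(W.shape[1]):
            if Q[x] > 0 and W[x, y] > 0:
                I += Q[x] * W[x, y] * np.log(W[x, y] / PY[y])
    return I

# Binary symmetric channel with crossover p: C = ln 2 - H_b(p) nats,
# achieved by the uniform input distribution.
p = 0.1
W = np.array([[1 - p, p], [p, 1 - p]])
C = np.log(2) + p * np.log(p) + (1 - p) * np.log(1 - p)

for q in (0.5, 0.3, 0.1):       # uniform Q attains C; the others fall short
    I = mutual_information(np.array([q, 1 - q]), W)
    print(f"Q = ({q}, {1 - q}): I = {I:.4f} nats  (C = {C:.4f})")
```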
Mutual Information expressed via Entropy (Equation 9)
Mutual information between input and output can be expressed as:
I(X; Y) = h(Y) - h(Y | X)
where:
- h(Y) is the differential entropy of the output.
- h(Y | X) is the conditional differential entropy of the output given the input.
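A minimal numeric check of this decomposition, using the discrete analogue H(·) of the differential entropy h(·); the input distribution and transition matrix are made-up values:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))               # entropy in nats

Q = np.array([0.3, 0.7])                        # input distribution (made up)
W = np.array([[0.9, 0.1],                       # W[x, y] = P(Y = y | X = x)
              [0.2, 0.8]])

PY = Q @ W                                      # output distribution
H_Y = entropy(PY)                               # H(Y)
H_Y_given_X = np.sum(Q * np.array([entropy(W[x]) for x in range(len(Q))]))

joint = Q[:, None] * W                          # joint distribution P(X, Y)
I_direct = np.sum(joint * np.log(joint / (Q[:, None] * PY[None, :])))

print(H_Y - H_Y_given_X, I_direct)              # the two values agree
```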
2. Conditional Entropy for the Channel (Equation 10)
The conditional entropy of the output given the input (assuming Gaussian noise with variance σ²) is:
h(Y | X) = (1/2) ln(2π e σ²)
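The sketch below compares this closed form against a Monte-Carlo estimate of -E[ln p(Z)] for Gaussian noise Z; the variance σ² = 0.5 is an arbitrary illustrative value:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 0.5                                   # noise variance σ² (illustrative)

# Closed form: h(Y|X) = (1/2) ln(2π e σ²)
h_analytic = 0.5 * np.log(2 * np.pi * np.e * sigma2)

# Monte-Carlo estimate of -E[ln p(Z)] with Z ~ N(0, σ²)
z = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
log_pdf = -0.5 * np.log(2 * np.pi * sigma2) - z**2 / (2 * sigma2)
h_mc = -log_pdf.mean()

print(h_analytic, h_mc)                        # agree to about 3 decimals
```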
3. Lower Bound on Capacity Using Input Distribution Q
Capacity is the maximum mutual information over all input distributions Q that satisfy the power constraints. Because this maximization is hard to carry out exactly, lower bounds are obtained instead by evaluating the mutual information for specific input distributions in three cases:
Case I: Both Average and Peak Power Constraints
Input parameter:
v = αP
Lower bound (Equation 14):
C(P, α, c, φ, ψ, d) = ln[ P (1 - e^(-μ*)) / (μ* sqrt(2π e σ²)) · e^(-c d) (m+1) A_r / (2π d²) · cos^m(φ) cos(ψ) ] + α μ* - f(α P)
Where μ* solves (Equation 15):
α = 1/μ* - e^(-μ*) / (1 - e^(-μ*))
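Equation 15 is transcendental in μ*, but its right-hand side decreases monotonically from 1/2 (as μ* → 0) to 0 (as μ* → ∞), so for 0 < α < 1/2 there is a unique root that simple bisection finds. A sketch under those assumptions; the solver structure and search interval are chosen here, not taken from the source:

```python
import numpy as np

def g(mu):
    # Right-hand side of Equation 15:
    # 1/μ - e^(-μ)/(1 - e^(-μ)) = 1/μ - 1/(e^μ - 1)
    return 1.0 / mu - 1.0 / np.expm1(mu)

def solve_mu_star(alpha, lo=1e-9, hi=700.0, tol=1e-12):
    """Bisection for μ*: g decreases from 1/2 to 0, so the root is unique."""
    assert 0.0 < alpha < 0.5
    while hi - lo > tol * max(1.0, hi):
        mid = 0.5 * (lo + hi)
        if g(mid) > alpha:
            lo = mid        # g(mid) too large -> root lies at larger μ
        else:
            hi = mid
    return 0.5 * (lo + hi)

for alpha in (0.1, 0.25, 0.4):
    mu = solve_mu_star(alpha)
    print(f"alpha = {alpha}:  mu* = {mu:.6f}  check g(mu*) = {g(mu):.6f}")
```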
Case II: Only Peak Power Constraint
Input parameter:
v = P / 2
Lower bound (Equation 16):
C(P, c, φ, ψ, d) = ln[ P / sqrt(2π e σ²) · e^(-c d) (m+1) A_r / (2π d²) · cos^m(φ) cos(ψ) ] - f(P/2)
Case III: Only Average Power Constraint
Input parameter:
v = ε, ε ≪ 1
Lower bound (Equation 17):
C(ε, c, φ, ψ, d) = ln[ ε / sqrt(2π e σ²) · e^(-c d) (m+1) A_r / (2π d²) · cos^m(φ) cos(ψ) ] - f(ε) + 1/2
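To make the structure of Equations 16 and 17 concrete, the sketch below evaluates the channel-gain factor and the two bounds. All numeric parameters are hypothetical, and f(·) is left as a caller-supplied placeholder because its definition is not given in this section; Case I would additionally use the μ* solver shown above.

```python
import numpy as np

def channel_gain(c, d, A_r, m, phi, psi):
    """Gain factor inside the log of Equations 14, 16 and 17:
    e^(-c d) · (m+1) A_r / (2π d²) · cos^m(φ) cos(ψ)."""
    return (np.exp(-c * d) * (m + 1) * A_r / (2 * np.pi * d**2)
            * np.cos(phi)**m * np.cos(psi))

def bound_case2(P, sigma2, gain, f):
    """Equation 16: peak power constraint only."""
    return np.log(P * gain / np.sqrt(2 * np.pi * np.e * sigma2)) - f(P / 2)

def bound_case3(eps, sigma2, gain, f):
    """Equation 17: average power constraint only."""
    return np.log(eps * gain / np.sqrt(2 * np.pi * np.e * sigma2)) - f(eps) + 0.5

# Hypothetical parameters, purely to exercise the formulas.
gain = channel_gain(c=0.1, d=5.0, A_r=1e-4, m=1, phi=0.2, psi=0.1)
f = lambda x: 0.0          # placeholder: substitute the source's actual f(·)
print(bound_case2(P=10.0, sigma2=1e-9, gain=gain, f=f))
print(bound_case3(eps=0.05, sigma2=1e-9, gain=gain, f=f))
```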
4. Summary
- Capacity is the maximized mutual information over all valid input distributions.
- Conditional entropy is fixed by the noise characteristics.
- Output entropy depends on the input distribution and constraints.
- Closed-form lower bounds are derived for different power constraint cases.
5. Intuition
- ln(·) terms relate to the entropy formulas for Gaussian noise.
- The exponential decay term e^(-c d) models channel attenuation.
- cos^m(φ) cos(ψ) terms represent geometric or directional gains.
- f(·) is a correction/penalty function related to the input constraints.
6. Mutual Information as a Lower Bound on Capacity
- Mutual information measures how much knowing the input reduces uncertainty about the output.
- Capacity is the maximum achievable mutual information.
- Exact maximization is difficult, so practical lower bounds are used by choosing specific input distributions.