
Mutual Information and Channel Capacity Explanation



1. Channel Capacity and Mutual Information

Channel capacity C is the maximum rate at which information can be reliably transmitted over a channel.

From information theory:


C = max_Q I(X; Y)  ≥  I(X; Y)  for any input distribution Q

where:

  • I(X; Y) is the mutual information between channel input X and output Y.
  • Q is the input distribution.
  • W is the channel transition probability.

Mutual Information expressed via Entropy (Equation 9)

Mutual information between input and output can be expressed as:


I(X; Y) = h(Y) - h(Y | X)

where:

  • h(Y) is the entropy of the output.
  • h(Y|X) is the conditional entropy of the output given the input.
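
The decomposition I(X; Y) = H(Y) - H(Y|X) can be made concrete with a minimal sketch for a binary symmetric channel (a discrete stand-in for the Gaussian channel treated below; the crossover probability eps and the input distribution p_one are illustrative choices, not from the article):

```python
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(p_one, eps):
    """I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel.

    p_one : P(X = 1), the input distribution (an illustrative choice)
    eps   : crossover probability of the channel
    """
    p_y1 = p_one * (1 - eps) + (1 - p_one) * eps  # P(Y = 1)
    return h2(p_y1) - h2(eps)                     # H(Y) - H(Y|X)

print(bsc_mutual_information(0.5, 0.1))  # ~0.531 bits with uniform input
```

Note that H(Y|X) depends only on the channel (here, on eps), while H(Y) depends on the input distribution; this is exactly the split exploited in the Gaussian case below.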

2. Conditional Entropy for the Channel (Equation 10)

The conditional entropy of the output given the input (assuming Gaussian noise with variance σ²) is:


h(Y|X) = ln(2πeσ²) / 2
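
This formula can be checked numerically: for Z ~ N(0, σ²), the differential entropy is h(Z) = E[-ln p(Z)], so a Monte Carlo average of -ln p(Z) should approach ln(2πeσ²)/2. A sketch (the sample size and seed are arbitrary choices):

```python
import math
import random

def gaussian_cond_entropy(sigma):
    """Closed form h(Y|X) = ln(2*pi*e*sigma^2) / 2, in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def monte_carlo_entropy(sigma, n=200_000, seed=1):
    """Estimate h(Z) = E[-ln p(Z)] for Z ~ N(0, sigma^2) by sampling."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, sigma)
        # -ln p(z) for a zero-mean Gaussian density with std sigma
        total += 0.5 * math.log(2 * math.pi * sigma ** 2) + z * z / (2 * sigma ** 2)
    return total / n

print(gaussian_cond_entropy(1.0))  # ~1.4189 nats for sigma = 1
print(monte_carlo_entropy(1.0))    # Monte Carlo estimate, close to the closed form
```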

3. Lower Bound on Capacity Using Input Distribution Q

Capacity is the maximum of the mutual information over all input distributions Q that satisfy the power constraints. Because this maximization is hard to carry out exactly, lower bounds are derived by evaluating the mutual information for specific input distributions in three cases:

Case I: Both Average and Peak Power Constraints

Input parameter:


v = αP
        

Lower bound (Equation 14):


C(P, α, c, φ, ψ, d) = ln[ P (1 - e^(-μ*)) e^(-c d) (m+1) A_r cos^m(φ) cos(ψ)
                          / (μ* sqrt(2πeσ²) · 2π d²) ]
                      + α μ* - f(αP)

Where μ* solves (Equation 15):


α = 1/μ* - e^(-μ*) / (1 - e^(-μ*))
        

Case II: Only Peak Power Constraint

Input parameter:


v = P / 2
        

Lower bound (Equation 16):


C(P, c, φ, ψ, d) = ln[ P e^(-c d) (m+1) A_r cos^m(φ) cos(ψ) / (2π d² sqrt(2πeσ²)) ]
                   - f(P/2)

Case III: Only Average Power Constraint

Input parameter:


v = ε,  ε ≪ 1
        

Lower bound (Equation 17):


C(ε, c, φ, ψ, d) = ln[ ε e^(-c d) (m+1) A_r cos^m(φ) cos(ψ) / (2π d² sqrt(2πeσ²)) ]
                   - f(ε) + 1/2

4. Summary

  • Capacity is the maximized mutual information over all valid input distributions.
  • Conditional entropy is fixed by the noise characteristics.
  • Output entropy depends on the input distribution and constraints.
  • Closed-form lower bounds are derived for different power constraint cases.

5. Intuition

  • The ln(·) terms come from the differential entropy of Gaussian noise.
  • The exponential decay factor e^(-c d) models channel attenuation over the link distance d.
  • The cos^m(φ) cos(ψ) factors represent geometric/directional gains of the link.
  • f(·) denotes correction/penalty terms arising from the input power constraints.

6. Mutual Information as a Lower Bound on Capacity

  • Mutual information measures how much knowing the input reduces uncertainty about the output.
  • Capacity is the maximum achievable mutual information over all admissible input distributions.
  • The exact maximization is difficult, so practical lower bounds are obtained by evaluating specific input distributions.
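
The last point can be made concrete for a binary symmetric channel: sweeping over input distributions shows the mutual information peaking at the uniform input, recovering the closed-form capacity 1 - H₂(ε). A sketch (the grid resolution and ε = 0.1 are illustrative choices):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mi_bsc(p_one, eps):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with P(X = 1) = p_one."""
    p_y1 = p_one * (1 - eps) + (1 - p_one) * eps
    return h2(p_y1) - h2(eps)

eps = 0.1
# Sweep candidate input distributions; the largest mutual information found
# is a numerical approximation of the channel capacity.
best_p, best_i = max(((k / 1000, mi_bsc(k / 1000, eps)) for k in range(1001)),
                     key=lambda t: t[1])
print(best_p, best_i)  # the maximum occurs at the uniform input, p = 0.5
print(1 - h2(eps))     # closed-form BSC capacity, 1 - H2(eps)
```

For the continuous channels above the same maximization runs over densities under power constraints, which is why closed-form answers are replaced by the lower bounds of Section 3.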

