
Advanced Information Theory and Channel Coding Concepts
Explore differential entropy, mutual information, channel capacity, and more in the field of Information Theory and Coding. Learn about maximizing entropy in practical systems and the capacity of band-limited channels with AWGN. Dive into the intricacies of the channel coding theorem and related topics.
Presentation Transcript
Subject Name: Information Theory and Coding · Subject Code: 10EC55 · Prepared By: Shima Ramesh, Pavana H. · Department: ECE · Date: 10/11/2014 · MVJCE
UNIT 4: Channel Coding Theorem
Topics to be covered:
- Channel coding theorem
- Differential entropy
- Mutual information for continuous ensembles
- Channel capacity theorem
Differential entropy: The entropy h(X) of a continuous source is defined as
$$h(X) = -\int_{-\infty}^{\infty} f(x)\,\log_2 f(x)\,dx$$
where f(x) is the probability density function (p.d.f.) of the continuous random variable X. Consider an example: suppose X is a uniform random variable over the interval (0, 2), so that f(x) = 1/2 for 0 < x < 2 and f(x) = 0 elsewhere. Substituting into the definition of entropy gives
$$h(X) = -\int_{0}^{2} \tfrac{1}{2}\,\log_2\tfrac{1}{2}\,dx = \log_2 2 = 1 \text{ bit.}$$
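The same value can be checked numerically. Below is a minimal Python sketch (grid size and variable names are illustrative) that approximates the defining integral for the uniform example with a Riemann sum:

```python
import numpy as np

# Approximate h(X) = -integral of f(x)*log2(f(x)) dx for X ~ Uniform(0, 2),
# where f(x) = 1/2 on (0, 2). The analytic answer is log2(2) = 1 bit.
x = np.linspace(0.0, 2.0, 100_000)   # integration grid over the support
dx = x[1] - x[0]
f = np.full_like(x, 0.5)             # uniform p.d.f. on (0, 2)
h = -np.sum(f * np.log2(f)) * dx     # Riemann-sum approximation
print(f"h(X) ≈ {h:.4f} bits")        # prints ≈ 1.0000
```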
Maximization of entropy: In practical systems, sources such as radio transmitters are constrained by either average-power or peak-power limitations. The objective then is to maximize the entropy under such restrictions. The general constraints include, among others, an average power limitation with a unidirectional distribution. A key result: for a fixed average power (mean-square value) $\sigma^2$, the differential entropy is maximized by a Gaussian density, giving $h(X) = \tfrac{1}{2}\log_2(2\pi e\,\sigma^2)$.
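As a quick numerical check on the maximum-entropy claim, the sketch below (the variance value is assumed) compares the closed-form entropies of a Gaussian and a uniform density with the same variance:

```python
import numpy as np

# For equal variance (average power), the Gaussian p.d.f. has the
# largest differential entropy. Compare closed forms against a
# zero-mean uniform density with the same variance.
sigma2 = 1.0                             # common variance (power)

# Gaussian: h = 0.5 * log2(2*pi*e*sigma^2)
h_gauss = 0.5 * np.log2(2 * np.pi * np.e * sigma2)

# Uniform on (-a, a): variance a^2/3 = sigma2  =>  a = sqrt(3*sigma2)
a = np.sqrt(3 * sigma2)
h_unif = np.log2(2 * a)                  # h = log2(width) for a uniform

print(f"Gaussian: {h_gauss:.4f} bits, uniform: {h_unif:.4f} bits")
# Gaussian ≈ 2.0471 bits > uniform ≈ 1.7925 bits
```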
Mutual information of a continuous channel: Consider a channel with input X, output Y, and joint p.d.f. f(x, y). The mutual information is
$$I(X;Y) = \int\!\!\int f(x,y)\,\log_2\frac{f(x,y)}{f(x)\,f(y)}\,dx\,dy = h(Y) - h(Y|X).$$
The channel capacity is given by
$$C = \max_{f(x)} I(X;Y),$$
where the maximization is over all admissible input densities f(x).
Amount of mutual information: Since the channel capacity is defined as the maximum of I(X;Y) over all input densities, it follows that the mutual information actually conveyed over the channel never exceeds C:
$$I(X;Y) \le C,$$
with equality when the input density is the capacity-achieving one.
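For the AWGN channel treated next, the mutual information with a Gaussian input has a simple closed form. A minimal sketch (S and N are assumed example values):

```python
import numpy as np

# Scalar AWGN channel Y = X + n with Gaussian input of power S and
# independent Gaussian noise of power N:
#   I(X;Y) = h(Y) - h(n)
#          = 0.5*log2(2*pi*e*(S+N)) - 0.5*log2(2*pi*e*N)
#          = 0.5*log2(1 + S/N)   bits per sample.
S, N = 4.0, 1.0
h_Y = 0.5 * np.log2(2 * np.pi * np.e * (S + N))  # output entropy
h_n = 0.5 * np.log2(2 * np.pi * np.e * N)        # noise entropy
print(f"I(X;Y) = {h_Y - h_n:.4f} bits/sample")   # 0.5*log2(5) ≈ 1.1610
```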
Capacity of band-limited channels with AWGN and average power limitation of signals: the Shannon-Hartley law. The received signal is composed of the transmitted signal X and the noise n, i.e. Y = X + n. Assuming the signal and noise are independent, the joint entropy at the transmitter end is
$$h(X, n) = h(X) + h(n).$$
The joint entropy at the receiver end is given by h(Y, n). Since the received signal is Y = X + n, the pair (Y, n) is obtained from (X, n) by a transformation with unit Jacobian, so h(Y, n) = h(X, n). From the above two equations, the information transferred per sample is h(Y) − h(n), and since a signal band-limited to B Hz is specified by 2B samples per second, the channel capacity in bits/second is
$$C = 2B\,\max_{f(x)}\bigl[h(Y) - h(n)\bigr].$$
If the additive noise is white and Gaussian with power N in a bandwidth of B Hz, then
$$h(n) = \tfrac{1}{2}\log_2(2\pi e N).$$
Further, if the input signal is limited to an average power S over the bandwidth B, and X and n are independent, the output power is $E[Y^2] = S + N$. For a given mean-square value, the entropy becomes a maximum if the signal is Gaussian; therefore the entropy of the output is at most
$$h(Y) = \tfrac{1}{2}\log_2\bigl(2\pi e(S+N)\bigr).$$
If n is AWGN, Y will be Gaussian if and only if X is also Gaussian; this implies that the maximizing input is Gaussian. Using all the above equations,
$$C = 2B\left[\tfrac{1}{2}\log_2\bigl(2\pi e(S+N)\bigr) - \tfrac{1}{2}\log_2(2\pi e N)\right] = B\log_2\left(1 + \frac{S}{N}\right). \quad (1)$$
Equation (1) is called the Shannon-Hartley law.
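A minimal sketch of equation (1) in code (the 3 kHz bandwidth and 30 dB SNR are assumed example values):

```python
import numpy as np

# Shannon-Hartley law: C = B * log2(1 + S/N), in bits/second.
def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * np.log2(1 + snr_linear)

# Example: a 3 kHz telephone-grade channel at 30 dB SNR.
B = 3_000.0
snr = 10 ** (30 / 10)                 # 30 dB -> linear ratio of 1000
print(f"C = {shannon_capacity(B, snr):,.0f} bits/s")  # ≈ 29,902 bits/s
```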
The primary significance of the Shannon-Hartley law is that it is possible to transmit over a channel of bandwidth B Hz perturbed by AWGN at a rate of C bits/sec with an arbitrarily small probability of error, provided the signal is encoded in such a manner that the samples are all Gaussian signals. Bandwidth-SNR trade-off:
From the figure (a flat two-sided noise power spectral density of η/2 over the band (−B, B)), the noise power is
$$N = \int_{-B}^{B}\frac{\eta}{2}\,df = \eta B.$$
That is, the noise power is directly proportional to the bandwidth B. Thus the noise power will be reduced by reducing the bandwidth, and vice versa.
For two ideal systems with the same capacity but different bandwidths B₁ and B₂, $B_1\log_2(1+S_1/N_1) = B_2\log_2(1+S_2/N_2)$. For a wideband system where $S/N \gg 1$, this leads to
$$\frac{S_2}{N_2} \approx \left(\frac{S_1}{N_1}\right)^{B_1/B_2}.$$
The above equation predicts an exponential improvement in the S/N ratio with bandwidth for an ideal system.
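The trade-off can be made concrete by holding the capacity fixed and varying the bandwidth; a minimal sketch (the 30 kbit/s target is an assumed example value):

```python
import numpy as np

# Bandwidth-SNR trade-off at fixed capacity C: from C = B*log2(1 + S/N),
# the required SNR is S/N = 2**(C/B) - 1, so shrinking B drives the
# required SNR up exponentially.
C = 30_000.0                                # target capacity, bits/s
for B in (30_000.0, 10_000.0, 3_000.0):     # candidate bandwidths, Hz
    snr = 2 ** (C / B) - 1
    print(f"B = {B:8,.0f} Hz -> required S/N = {snr:7.1f} "
          f"({10 * np.log10(snr):4.1f} dB)")
# 30 kHz needs 0.0 dB, 10 kHz needs 8.5 dB, 3 kHz needs 30.1 dB.
```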
Capacity of a channel of infinite bandwidth: The Shannon-Hartley formula predicts that a noiseless Gaussian channel with S/N = ∞ has an infinite capacity. However, the channel capacity does not become infinite when the bandwidth is made infinite, because the noise power N = ηB grows along with the bandwidth.
Accordingly,
$$C_\infty = \lim_{B\to\infty} B\log_2\left(1 + \frac{S}{\eta B}\right) = \frac{S}{\eta}\log_2 e \approx 1.44\,\frac{S}{\eta},$$
which places an upper limit on the channel capacity with increasing bandwidth.
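The saturation is easy to see numerically; a minimal sketch (S and η are assumed example values):

```python
import numpy as np

# C(B) = B*log2(1 + S/(eta*B)) saturates at C_inf = (S/eta)*log2(e)
# as the bandwidth B grows without bound.
S, eta = 1.0, 1e-3                       # signal power, noise p.s.d.
for B in (1e2, 1e3, 1e4, 1e5):
    C = B * np.log2(1 + S / (eta * B))
    print(f"B = {B:9,.0f} Hz -> C = {C:7.1f} bits/s")
print(f"C_inf = {(S / eta) * np.log2(np.e):.1f} bits/s")  # ≈ 1442.7
```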
Bandwidth efficiency: Shannon limit. For a system operating at capacity, the average transmitted power is expressed as S = E_b·C, where E_b is the energy per bit. With N = N₀B (writing N₀ for the noise power spectral density η), the Shannon formula becomes
$$\frac{C}{B} = \log_2\left(1 + \frac{E_b}{N_0}\cdot\frac{C}{B}\right),$$
from which one can show
$$\frac{E_b}{N_0} = \frac{2^{C/B} - 1}{C/B}.$$
The ratio C/B is called the bandwidth efficiency of the system.
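A minimal sketch tabulating the required E_b/N₀ against bandwidth efficiency, showing the approach to the Shannon limit:

```python
import numpy as np

# Required Eb/N0 versus bandwidth efficiency rho = C/B:
#   Eb/N0 = (2**rho - 1)/rho.
# As rho -> 0 this tends to ln(2) ≈ 0.693, i.e. -1.59 dB.
for rho in (4.0, 2.0, 1.0, 0.5, 0.01):
    ebn0 = (2 ** rho - 1) / rho
    print(f"C/B = {rho:5.2f} -> Eb/N0 = {10 * np.log10(ebn0):6.2f} dB")
print(f"limit as C/B -> 0: {10 * np.log10(np.log(2)):.2f} dB")  # -1.59
```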
If C/B = 1, it follows that E_b = N₀; this implies that the signal power equals the noise power. Suppose B = B₀ is the bandwidth for which S = N, i.e. B₀ = S/N₀; then
$$C_\infty = 1.443\,B_0.$$
That is, the maximum signaling rate for a given S is 1.443 bits/sec per Hz of the bandwidth over which the signal power can be spread without its falling below the noise level.
The value
$$\frac{E_b}{N_0} = \ln 2 \approx 0.693 \;(-1.59\ \mathrm{dB})$$
is known as Shannon's limit for transmission capacity: error-free communication is possible only when E_b/N₀ exceeds this limit, and the communication fails otherwise.
Bandwidth efficiency diagram: We define an ideal system as one that transmits data at a bit rate R equal to the channel capacity C. The diagram plots the bandwidth efficiency R/B as a function of the energy-per-bit to noise power spectral density ratio E_b/N₀.
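A minimal plotting sketch of the diagram for the ideal R = C system (the axis range is assumed):

```python
import numpy as np
import matplotlib.pyplot as plt

# Bandwidth-efficiency diagram for an ideal system (R = C): spectral
# efficiency C/B versus required Eb/N0, with the Shannon limit at
# -1.59 dB appearing as a vertical asymptote.
rho = np.logspace(-2, 4, 400, base=2.0)        # C/B from 0.25 to 16
ebn0_db = 10 * np.log10((2 ** rho - 1) / rho)  # required Eb/N0 in dB

plt.semilogy(ebn0_db, rho)
plt.axvline(10 * np.log10(np.log(2)), linestyle="--", label="Shannon limit")
plt.xlabel("Eb/N0 (dB)")
plt.ylabel("Bandwidth efficiency R/B (bits/s/Hz)")
plt.legend()
plt.show()
```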