Digital Communication & Signal Processing Seminar Insights

Explore the world of digital communication and signal processing through probabilistic models, entropy calculations, Huffman coding, and compression ratios. Delve into the analysis of English text, source probabilities, instantaneously parsable codes, and more. Gain valuable knowledge on information entropy and its practical applications in data encoding and compression.

  • Seminar
  • Digital Communication
  • Signal Processing
  • Probabilistic Models
  • Huffman Coding

Presentation Transcript


  1. Seminar 2: Digital Communication and Signal Processing. Ruohan Zhang (Ruohan.Zhang.1@warwick.ac.uk), Department of Computer Science, University of Warwick, 22/01/2024.

  2. 1. The datafile midsummer contains the text of Shakespeare's comedy A Midsummer Night's Dream (including text-formatting characters). Based upon this data, construct a probabilistic model for English text and calculate the corresponding entropy. Key point: the general definition of information entropy is expressed in terms of a discrete set of probabilities $p(x_i)$, so that $H(X) = -\sum_i p(x_i) \log p(x_i)$.
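
A minimal sketch of how such a model might be computed, assuming the datafile is plain text; the filename `midsummer`, the per-character frequency model, and base-2 logarithms are assumptions of this sketch rather than part of the seminar materials.

```python
from collections import Counter
from math import log2

# Read the text, keeping formatting characters as part of the source alphabet.
with open("midsummer", "r") as f:
    text = f.read()

# Probabilistic model: relative frequency of each character.
counts = Counter(text)
total = len(text)
probs = {ch: n / total for ch, n in counts.items()}

# Entropy H = -sum_i p(x_i) * log2 p(x_i), in bits per character.
entropy = -sum(p * log2(p) for p in probs.values())
print(f"alphabet size: {len(probs)}, entropy: {entropy:.3f} bits/character")
```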

  3. 2. Load the datafile message. The data represent the output of an unknown source. (1) Construct a probability model for the source and compute the corresponding entropy. (2) Build an instantaneously parsable code for the given source. Key point: instantaneously parsable codes, Huffman coding. (3) Encode the data using the code produced. a) What compression ratio have you achieved? b) What is the maximum compression ratio you can achieve for this source using such codes? Key points: the Huffman code tree, Shannon's first (source-coding) theorem.
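
For part (2), a Huffman code is one standard way to obtain an instantaneously parsable (prefix-free) code. Below is a minimal sketch, assuming the source symbols and their estimated probabilities are already available as a dictionary; the function name `huffman_code` is illustrative. The exact 0/1 labelling it produces may differ from the tree on the later slides, but the codeword lengths (and hence the average length) are the same.

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a prefix-free (instantaneously parsable) Huffman code.

    probs: dict mapping symbol -> probability.
    Returns a dict mapping symbol -> codeword string of '0'/'1' bits.
    """
    order = count()  # tie-breaker so equal probabilities never compare the dicts
    # Each heap entry: (subtree probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, next(order), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)  # least probable subtree
        p1, _, code1 = heapq.heappop(heap)  # second least probable subtree
        # Prepend a branch bit to every codeword in each subtree, then merge.
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, next(order), merged))
    return heap[0][2]

# Example with the 2-symbol block probabilities quoted on the later slides.
print(huffman_code({"YY": 0.5604, "YN": 0.1932, "NY": 0.1862, "NN": 0.0582}))
```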

  4. Source: YYNNNYYNNNYYNNYYNNNNYYYYYYNNNY. Split into 2-symbol combinations: YY NN NY YN NN YY NN YY NN NN YY YY YY NN NY; the four different kinds are YY, YN, NY, NN. Split into 3-symbol combinations: YYN NNY YNN NYY NNY YNN NNY YYY YYN NNY; the eight possible kinds are YYY, YYN, YNY, YNN, NYY, NYN, NNY, NNN.
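
A sketch of the blocking step, assuming the decoded source is available as a string of 'Y'/'N' characters; any tail shorter than the block length is simply dropped here, which is one possible convention rather than necessarily the one used in the seminar.

```python
from collections import Counter

def block_probabilities(symbols, n):
    """Split a symbol string into non-overlapping n-symbol blocks and
    return each block's relative frequency."""
    blocks = [symbols[i:i + n] for i in range(0, len(symbols) - n + 1, n)]
    counts = Counter(blocks)
    return {b: c / len(blocks) for b, c in counts.items()}

# The short excerpt shown on the slide; the full datafile would normally be used.
source = "YYNNNYYNNNYYNNYYNNNNYYYYYYNNNY"
print(block_probabilities(source, 2))
print(block_probabilities(source, 3))
```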

  5. A Huffman code tree for the 2-symbol combinations, starting from the probabilities YY 0.5604, YN 0.1932, NY 0.1862, NN 0.0582. Step 1: merge the two least probable symbols, NY (0.1862) and NN (0.0582), into a node of probability 0.2444, labelling its two branches 0 and 1.

  6. A Huffman code tree for the 2-symbol combinations, step 2: the remaining entries are YY (0.5604), YN (0.1932) and the merged node {NY, NN} (0.2444).

  7. A Huffman code tree for the 2-symbol combinations, step 3: merge the two least probable entries, YN (0.1932) and the node {NY, NN} (0.2444), into a node of probability 0.4396, again labelling its branches 0 and 1.

  8. A Huffman code tree for the 2-symbol combinations, step 4: merge the last two entries, YY (0.5604) and the node {YN, NY, NN} (0.4396), to form the root and complete the tree.

  9. Reading the branch labels from the root down the completed tree gives the Huffman codewords: YY → 0, YN → 10, NY → 110, NN → 111.
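
A quick check of how close this code gets to the bound given by Shannon's first theorem, using the probabilities and codeword lengths quoted on the slides. With these figures the average codeword length is 1.68 bits per 2-symbol block against an entropy of roughly 1.62 bits per block (approximate, since the quoted probabilities are rounded), so the code is within about 0.06 bit of the limit; longer blocks can close the remaining gap.

```python
from math import log2

probs = {"YY": 0.5604, "YN": 0.1932, "NY": 0.1862, "NN": 0.0582}
codes = {"YY": "0", "YN": "10", "NY": "110", "NN": "111"}

# Average codeword length, in bits per 2-symbol block.
avg_len = sum(p * len(codes[s]) for s, p in probs.items())

# Entropy of the block distribution, in bits per 2-symbol block.
entropy = -sum(p * log2(p) for p in probs.values())

print(f"average length: {avg_len:.4f} bits/block ({avg_len / 2:.4f} bits/symbol)")
print(f"entropy bound : {entropy:.4f} bits/block ({entropy / 2:.4f} bits/symbol)")
```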

  10. Encoding the source with the Huffman code. Source: YYNNNYYNNNYYNNYYNNNNYYYYYYNNNY. Split into 2-symbol combinations: YY NN NY YN NN YY NN YY NN NN YY YY YY NN NY. Applying the codewords YY → 0, YN → 10, NY → 110, NN → 111 gives the encoded stream 0 111 110 10 111 0 111 0 111 111 0 0 0 111 110.
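
A sketch of encoding the excerpt with the codeword table above and decoding it again, illustrating that a prefix-free code is instantaneously parsable: each codeword is recognised as soon as its last bit arrives, with no look-ahead. The helper names are illustrative.

```python
codes = {"YY": "0", "YN": "10", "NY": "110", "NN": "111"}

def encode(symbols, codes, n=2):
    # Block the source into n-symbol combinations and concatenate their codewords.
    blocks = [symbols[i:i + n] for i in range(0, len(symbols) - n + 1, n)]
    return "".join(codes[b] for b in blocks)

def decode(bits, codes):
    inverse = {cw: sym for sym, cw in codes.items()}
    out, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in inverse:            # no codeword is a prefix of another,
            out.append(inverse[buffer])  # so this match is unambiguous
            buffer = ""
    return "".join(out)

source = "YYNNNYYNNNYYNNYYNNNNYYYYYYNNNY"
bits = encode(source, codes)
print(bits)
assert decode(bits, codes) == source
```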

  11. Exercise: try constructing a Huffman code tree for the 3-symbol combinations in the same way.
