Papers We Love Too: Shannon's Mathematical Theory of Communication

Papers We Love Too: June 2016

Kiran Bhattaram

June 23, 2016

Transcript

  1. Papers We Love Too: June 2016 A Mathematical Theory of

    Communication presented by Kiran Bhattaram
  2. Contributions 1. All communication is the same. 2. Information is

    measurable. 3. Information can be transmitted without error over noisy channels.
  3. An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation → … → estimated data
  4. An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation → NOISY CHANNEL → … → estimated data
  5. An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation → NOISY CHANNEL → Demodulation → Decoding → Decompression → estimated data
  6. A Series of Approximations to English XFOML RXKHRJFFJUJ ZLPWCFWKCYJ FFJEYVKCQSGHYD

    QPAAMKBZAACIBZLHJQD. 1. Symbols independent and equiprobable.
  7. A Series of Approximations to English OCRO HLI RGWR NMIELWIS

    EU LL NBNESEBYA TH EEI ALHENHTTPA OOBTTVA NAH BRL. 2. Symbols independent and with the frequency of English text.
  8. A Series of Approximations to English ON IE ANTSOUTINYS ARE

    T INCTORE ST BE S DEAMY ACHIN D ILONASIVE TUCOOWE AT TEASONARE FUSO TIZIN ANDY TOBE SEACE CTISBE. 3. Digram structure as in English
  9. A Series of Approximations to English Representing and speedily is

    an good apt or come can different natural Here he the a in came the to of to expert gray come to furnishes the line message had be these 4. Word frequency as English; independent words
  10. A Series of Approximations to English The head and in

    frontal attack on an English writer that the character of this point is therefore another method for the letters that the time of whoever told the problem for an unexpected 5. Word digrams as in English
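Slides 6 through 10 build text from progressively better statistical models of English. The idea can be sketched with a minimal character-level Markov model (the corpus, the model order, and the seed below are all illustrative assumptions; Shannon generated his examples by hand from frequency tables):

```python
import random
from collections import defaultdict

def build_model(text, order):
    """Count which characters follow each context of `order` characters."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, order, length, seed=0):
    """Emit `length` characters by sampling successors of the current context."""
    rng = random.Random(seed)
    context = rng.choice(list(model))
    out = list(context)
    for _ in range(length):
        choices = model.get("".join(out[-order:]))
        if not choices:  # dead end: restart from a random context
            context = rng.choice(list(model))
            out.extend(context)
            choices = model[context]
        out.append(rng.choice(choices))
    return "".join(out)

corpus = "the head and in frontal attack on an english writer " * 4
print(generate(build_model(corpus, 2), order=2, length=40))
```

Raising `order` moves the output from slide-6 gibberish toward slide-10 near-English, exactly the progression the talk walks through.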
  11. Huffman Codes (1951): a code tree with branches labeled 0 and 1. Dinner / Probability / Encoding: thai 1/2 → '0'; szechuan 1/3 → '10'; pizza 1/12 → '110'; ice cream 1/12 → '111'
  12. Huffman Codes: Dinner / Probability / Encoding: thai 1/2 → '0'; szechuan 1/3 → '10'; pizza 1/12 → '110'; ice cream 1/12 → '111'
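The dinner-table encoding above can be reproduced with the standard heap-based Huffman construction (a sketch; tie-breaking may assign different bits than the slide, but the code lengths 1, 2, 3, 3 come out the same):

```python
import heapq
import math
from fractions import Fraction

def huffman(probs):
    """Build a Huffman code by repeatedly merging the two least likely nodes."""
    heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)                       # tie-breaker for equal weights
    codes = {sym: "" for sym in probs}
    groups = {sym: [sym] for sym in probs}    # leaves under each tree node
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for s in groups[a]:                   # lighter subtree gets a 0 bit
            codes[s] = "0" + codes[s]
        for s in groups[b]:                   # heavier subtree gets a 1 bit
            codes[s] = "1" + codes[s]
        groups[a + b] = groups[a] + groups[b]
        heapq.heappush(heap, (p1 + p2, counter, a + b))
        counter += 1
    return codes

dinner = {"thai": Fraction(1, 2), "szechuan": Fraction(1, 3),
          "pizza": Fraction(1, 12), "ice cream": Fraction(1, 12)}
codes = huffman(dinner)
for sym, code in sorted(codes.items(), key=lambda kv: len(kv[1])):
    print(sym, code)

avg = sum(p * len(codes[s]) for s, p in dinner.items())
ent = -sum(float(p) * math.log2(float(p)) for p in dinner.values())
print(f"average length {float(avg):.3f} bits, entropy {ent:.3f} bits")
```

The average code length (5/3 ≈ 1.667 bits) sits just above the source entropy (≈ 1.626 bits), which is the connection the next slide's entropy discussion makes.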
  14. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.' - Claude Shannon
  15. Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation → NOISY CHANNEL → Demodulation → Decoding → Decompression → estimated data
  19. Conditional Entropy: the entropy of the source, and the uncertainty in the received signal if the message is known (the uncertainty added by noise).
  20. Conditional Entropy: the entropy of the source; the uncertainty in the received signal if the message is known (the uncertainty added by noise); and the entropy of the received message.
  21. Conditional Entropy: R = H(x) - H_y(x) = H(y) - H_x(y), where H(x) is the entropy of the source, H_y(x) is the uncertainty in the message if the received signal is known (the equivocation), H_x(y) is the uncertainty in the received signal if the message is known (the uncertainty added by noise), and H(y) is the entropy of the received message.
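The quantities on slides 19 through 21 can be checked numerically for a binary symmetric channel (the crossover probability 0.1 and the uniform source below are assumptions for illustration):

```python
import math

def H(ps):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

crossover = 0.1              # channel flip probability (assumed)
px = [0.5, 0.5]              # uniform binary source (assumed)
# joint distribution P(x, y) of sent bit x and received bit y
joint = {(x, y): px[x] * (crossover if x != y else 1 - crossover)
         for x in (0, 1) for y in (0, 1)}
py = [sum(pr for (x, y), pr in joint.items() if y == b) for b in (0, 1)]

Hxy = H(joint.values())
Hx_y = Hxy - H(py)   # H_y(x): uncertainty in the message given the received signal
Hy_x = Hxy - H(px)   # H_x(y): uncertainty added by noise
R = H(px) - Hx_y     # rate of actual transmission
print(f"equivocation H_y(x) = {Hx_y:.4f}, noise H_x(y) = {Hy_x:.4f}, R = {R:.4f}")
```

With these numbers the source emits 1 bit per symbol but the channel only delivers about 0.53 bits of it; the other 0.47 bits are lost to the equivocation, and the two forms of R agree as the slide's identity says.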
  22. The surprising thing about capacity: P(error) → 0 does not require overhead → ∞. A simple error-correcting code: add parity bits to a block like 0 0 0 0.
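The parity-bit code mentioned on the slide can be sketched in a few lines (even parity over a 4-bit block is an assumption for illustration). It detects any single-bit error at a cost of one extra bit per block, but cannot correct it:

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check(block):
    """True if the block passes the parity check (no odd number of flips)."""
    return sum(block) % 2 == 0

block = add_parity([0, 1, 1, 0])
assert check(block)
block[2] ^= 1            # a single bit flip on the noisy channel...
assert not check(block)  # ...is detected, though not corrected
```

Shannon's surprise is that far cleverer codes can drive the error probability toward zero while the overhead stays bounded, as long as the rate is below capacity.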
  23. Shannon-Hartley Theorem: C = B log2(1 + S/N), where C is the channel capacity in bits/s and B is the bandwidth of the channel.
  24. Shannon-Hartley Theorem: C = B log2(1 + S/N), where C is the channel capacity in bits/s, B is the bandwidth of the channel, and S/N is the signal-to-noise ratio.
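Plugging representative numbers into the Shannon-Hartley formula (a 3 kHz voice channel at 30 dB SNR; these values are illustrative, not from the slide):

```python
import math

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N), in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (30 / 10)          # 30 dB -> linear power ratio of 1000
print(f"{capacity(3000, snr):.0f} bits/s")   # roughly 30 kbit/s
```

Note that capacity grows only logarithmically with signal power, which is why extra bandwidth is usually worth more than extra transmit power.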
  30. Images from Mars: 1964, Mariner IV, using no encoding; 1969, Mariner VI, using Reed-Muller encoding.