Slide 1

Slide 1 text

Papers We Love Too: June 2016 A Mathematical Theory of Communication presented by Kiran Bhattaram

Slide 2

Slide 2 text

Kiran Bhattaram @kiranb

Slide 3

Slide 3 text

Agenda A Brief History The Paper Impact

Slide 4

Slide 4 text

A Brief History 1800s to 1948

Slide 5

Slide 5 text

discovering limits heat engines! speed of light! temperature! incompleteness!

Slide 6

Slide 6 text

communications telegraphy telephone wireless telegraphy AM radio FM radio television

Slide 7

Slide 7 text

“just crank up the volume” Transatlantic Telegraph, 1858

Slide 8

Slide 8 text

No content

Slide 9

Slide 9 text

Transmission Speeds 0.064 bits/s 10 bits/s 10 billion bits/s

Slide 10

Slide 10 text

A Mathematical Theory of Communication Claude Shannon, Bell System Technical Journal (1948)

Slide 11

Slide 11 text

A Mathematical Theory of Communication Claude Shannon, Bell System Technical Journal (1948), republished in 1949 as The Mathematical Theory of Communication

Slide 12

Slide 12 text

No content

Slide 13

Slide 13 text

Contributions

Slide 14

Slide 14 text

Contributions 1. All communication is the same.

Slide 15

Slide 15 text

Contributions 1. All communication is the same. 2. Information is measurable.

Slide 16

Slide 16 text

Contributions 1. All communication is the same. 2. Information is measurable. 3. Information can be transmitted without error over noisy channels.

Slide 17

Slide 17 text

1. All communication is the same

Slide 18

Slide 18 text

No content

Slide 19

Slide 19 text

An Overview! data → estimated data

Slide 20

Slide 20 text

An Overview! data → Compress (Huffman, LZW, etc.) → estimated data

Slide 21

Slide 21 text

An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → estimated data

Slide 22

Slide 22 text

An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → estimated data

Slide 23

Slide 23 text

An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → estimated data

Slide 24

Slide 24 text

An Overview! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 25

Slide 25 text

2. Information is Measurable

Slide 26

Slide 26 text

A Series of Approximations to English

Slide 27

Slide 27 text

A Series of Approximations to English XFOML RXKHRJFFJUJ ZLPWCFWKCYJ FFJEYVKCQSGHYD QPAAMKBZAACIBZLHJQD. 1. Symbols independent and equiprobable.

Slide 28

Slide 28 text

A Series of Approximations to English OCRO HLI RGWR NMIELWIS EU LL NBNESEBYA TH EEI ALHENHTTPA OOBTTVA NAH BRL. 2. Symbols independent and with the frequency of English text.

Slide 29

Slide 29 text

A Series of Approximations to English ON IE ANTSOUTINYS ARE T INCTORE ST BE S DEAMY ACHIN D ILONASIVE TUCOOWE AT TEASONARE FUSO TIZIN ANDY TOBE SEACE CTISBE. 3. Digram structure as in English

Slide 30

Slide 30 text

A Series of Approximations to English Representing and speedily is an good apt or come can different natural Here he the a in came the to of to expert gray come to furnishes the line message had be these 4. Word frequency as English; independent words

Slide 31

Slide 31 text

A Series of Approximations to English The head and in frontal attack on an English writer that the character of this point is therefore another method for the letters that the time of whoever told the problem for an unexpected 5. Word digrams as in English

Slide 32

Slide 32 text

Markov Processes

Slide 33

Slide 33 text

Markov Processes

Slide 34

Slide 34 text

Markov Processes

Slide 35

Slide 35 text

Markov Processes
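
The Markov-chain figures on these slides didn't survive as text, but the idea they illustrate is the one behind the digram approximations above: model a source as a chain where each symbol depends on the previous one, then sample from it. A minimal sketch (my own illustration, not from the deck) using letter bigrams:

```python
import random
from collections import defaultdict

def build_bigram_model(corpus):
    """Count letter successors, mirroring Shannon's digram statistics."""
    model = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        model[a].append(b)  # repeated successors preserve their frequencies
    return model

def generate(model, start, length):
    """Random walk on the chain: each letter depends only on the previous one."""
    out = [start]
    for _ in range(length):
        out.append(random.choice(model.get(out[-1], list(model))))
    return "".join(out)

corpus = "on the state of the art of communication in american english text "
print(generate(build_bigram_model(corpus), "t", 60))
```

A larger corpus gives output closer to Shannon's third approximation; running the same walk over words instead of letters gives his fourth and fifth.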

Slide 36

Slide 36 text

Encoding Messages
Dinner      Probability  Encoding
thai        1/2          '00'
szechuan    1/3          '01'
pizza       1/12         '10'
ice cream   1/12         '11'

Slide 37

Slide 37 text

Huffman Codes (1951)
[code tree: thai, szechuan, pizza, ice cream at the leaves, branches labeled 0/1]
Dinner      Probability  Encoding
thai        1/2          '0'
szechuan    1/3          '10'
pizza       1/12         '110'
ice cream   1/12         '111'

Slide 38

Slide 38 text

Huffman Codes
Dinner      Probability  Encoding
thai        1/2          '0'
szechuan    1/3          '10'
pizza       1/12         '110'
ice cream   1/12         '111'

Slide 39

Slide 39 text

Huffman Codes
Dinner      Probability  Encoding
thai        1/2          '0'
szechuan    1/3          '10'
pizza       1/12         '110'
ice cream   1/12         '111'
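
As a sketch of how the table above is built (an illustration, not the deck's code): Huffman's greedy algorithm repeatedly merges the two least-probable entries, prepending one more bit to each side's codewords.

```python
import heapq
from fractions import Fraction

def huffman(probs):
    """Greedy Huffman construction: merge the two least-probable nodes until one remains."""
    # heap entries: (probability, tiebreaker, {symbol: code-so-far})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, zero = heapq.heappop(heap)  # lighter subtree: prefix '0'
        p1, _, one = heapq.heappop(heap)   # heavier subtree: prefix '1'
        merged = {s: "0" + c for s, c in zero.items()}
        merged.update({s: "1" + c for s, c in one.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

dinner = {"thai": Fraction(1, 2), "szechuan": Fraction(1, 3),
          "pizza": Fraction(1, 12), "ice cream": Fraction(1, 12)}
print(huffman(dinner))  # codeword lengths 1, 2, 3, 3: same lengths as the table
                        # (the exact bits can differ; any such code is optimal)
```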

Slide 40

Slide 40 text

Information content

Slide 41

Slide 41 text

Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’ - Claude Shannon

Slide 42

Slide 42 text

Source Coding Theorem setting limits on compression

Slide 43

Slide 43 text

Source Coding Theorem setting limits on compression: the average codeword length ≥ the average information per symbol (the entropy)
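
A quick check of the bound on the dinner example: the entropy works out to about 1.63 bits per symbol, and the Huffman code above averages about 1.67, so the code sits just above the limit, as the theorem requires.

```python
from math import log2

dinner = {"thai": 1/2, "szechuan": 1/3, "pizza": 1/12, "ice cream": 1/12}
lengths = {"thai": 1, "szechuan": 2, "pizza": 3, "ice cream": 3}

H = sum(p * log2(1 / p) for p in dinner.values())  # average information per symbol
L = sum(dinner[s] * lengths[s] for s in dinner)    # average codeword length

print(f"H ≈ {H:.3f} bits/symbol")  # ≈ 1.626
print(f"L ≈ {L:.3f} bits/symbol")  # ≈ 1.667, and L ≥ H
```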

Slide 44

Slide 44 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 45

Slide 45 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 46

Slide 46 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 47

Slide 47 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 48

Slide 48 text

3. Information can be transmitted without error * up to a certain limit

Slide 49

Slide 49 text

Conditional Probabilities 0 1 0 1 Sent Received

Slide 50

Slide 50 text

Conditional Probabilities 0 1 0 1 0.9 Sent Received

Slide 51

Slide 51 text

Conditional Probabilities 0 1 0 1 0.9 0.1 Sent Received

Slide 52

Slide 52 text

Conditional Probabilities Sent → Received: 0→0 and 1→1 with p = 0.9; 0→1 and 1→0 with p = 0.1
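
The diagram is a binary symmetric channel: each bit arrives intact with probability 0.9 and flipped with probability 0.1. A small simulation (illustrative, using the slide's numbers) confirms the crossover rate:

```python
import random

def bsc(bits, p_flip=0.1):
    """Binary symmetric channel: flip each bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

random.seed(0)
sent = [random.randint(0, 1) for _ in range(100_000)]
received = bsc(sent)
print(sum(s != r for s, r in zip(sent, received)) / len(sent))  # ≈ 0.1
```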

Slide 53

Slide 53 text

Conditional Entropy

Slide 54

Slide 54 text

Conditional Entropy

Slide 55

Slide 55 text

Conditional Entropy

Slide 56

Slide 56 text

Conditional Entropy entropy of source

Slide 57

Slide 57 text

Conditional Entropy entropy of source uncertainty in the received signal, if the message is known
 (uncertainty added by noise)

Slide 58

Slide 58 text

Conditional Entropy entropy of source uncertainty in the received signal, if the message is known
 (uncertainty added by noise) entropy of received message

Slide 59

Slide 59 text

Conditional Entropy entropy of source uncertainty in the message,
 if the received signal is known uncertainty in the received signal, if the message is known
 (uncertainty added by noise) entropy of received message

Slide 60

Slide 60 text

Channel Capacity entropy of received message uncertainty in the message,
 if the received signal is known
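
The equations these labels annotated were images and didn't survive extraction. In modern notation (Shannon himself writes H_y(x) for the conditional entropy), the capacity they describe is:

```latex
C = \max_{p(x)} \left[ H(Y) - H(Y \mid X) \right]
  = \max_{p(x)} \left[ H(X) - H(X \mid Y) \right]
```

For the binary symmetric channel above, this gives C = 1 - H(0.1) ≈ 0.53 bits per use: a channel that flips one bit in ten loses almost half its raw capacity.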

Slide 61

Slide 61 text

The surprising thing about capacity a simple error-correcting code: add parity bits 0 P(error) = 1/4

Slide 62

Slide 62 text

The surprising thing about capacity a simple error-correcting code: add parity bits 0 0 P(error) = 1/16

Slide 63

Slide 63 text

The surprising thing about capacity a simple error-correcting code: add parity bits 0 0 0 P(error) = 1/64

Slide 64

Slide 64 text

The surprising thing about capacity a simple error-correcting code: add parity bits 0 0 0 0 P(error) = 1/256

Slide 65

Slide 65 text

The surprising thing about capacity a simple error-correcting code: add parity bits 0 0 0 0

Slide 66

Slide 66 text

The surprising thing about capacity P(error) → 0 overhead → ∞ a simple error-correcting code: add parity bits 0 0 0 0
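
To make the trade-off concrete (my own sketch; the slides' 1/4, 1/16, ... figures are schematic): for an n-fold repetition code over the binary symmetric channel with crossover 0.1, majority voting drives the error down, but only as the rate 1/n collapses toward zero. Shannon's noisy-channel theorem is surprising precisely because it promises vanishing error at a fixed rate below capacity.

```python
from math import comb

def repetition_error(n, p=0.1):
    """P(majority vote fails) for an n-fold repetition code over a BSC(p)."""
    # The decoder errs when more than half of the n copies get flipped.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 5, 7, 9):
    print(f"n={n}: rate={1/n:.2f}, P(error)={repetition_error(n):.5f}")
```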

Slide 67

Slide 67 text

Shannon-Hartley Theorem C = B log2(1 + S/N)

Slide 68

Slide 68 text

Shannon-Hartley Theorem C = B log2(1 + S/N) C: channel capacity in bits/s

Slide 69

Slide 69 text

Shannon-Hartley Theorem C = B log2(1 + S/N) C: channel capacity in bits/s; B: bandwidth of the channel

Slide 70

Slide 70 text

Shannon-Hartley Theorem C = B log2(1 + S/N) C: channel capacity in bits/s; B: bandwidth of the channel; S/N: signal-to-noise ratio
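
Plugging in illustrative numbers (mine, not the deck's): a ~3 kHz analog telephone line at 30 dB signal-to-noise caps out near 30 kbit/s, which is roughly where dial-up modems stalled.

```python
from math import log2

def shannon_hartley(bandwidth_hz, snr_db):
    """Channel capacity C = B * log2(1 + S/N), with S/N supplied in dB."""
    return bandwidth_hz * log2(1 + 10 ** (snr_db / 10))

print(f"{shannon_hartley(3000, 30):,.0f} bits/s")  # ≈ 29,902 bits/s
```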

Slide 71

Slide 71 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 72

Slide 72 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 73

Slide 73 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 74

Slide 74 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 75

Slide 75 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 76

Slide 76 text

Impact

Slide 77

Slide 77 text

Hamming Codes

Slide 78

Slide 78 text

Hamming Codes [diagram: data bits D1–D4 interleaved with parity bits P1–P4]

Slide 79

Slide 79 text

Hamming Codes 0 1 0 1 0 1 1 1

Slide 80

Slide 80 text

Hamming Codes 0 1 0 1 0 0 1 1
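
The two bit patterns show a codeword with one bit flipped in transit. In the standard Hamming(7,4) construction (a textbook layout, which may not match the slide's exact bit ordering), each parity bit covers the positions whose binary index has a given bit set, so XORing the positions of the set bits spells out the error's location:

```python
def hamming74_encode(d):
    """Encode 4 data bits as [p1, p2, d1, p3, d2, d3, d4] (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """XOR the 1-based positions of set bits; a nonzero result locates the flip."""
    syndrome = 0
    for pos, bit in enumerate(c, start=1):
        if bit:
            syndrome ^= pos
    if syndrome:
        c[syndrome - 1] ^= 1  # repair the single-bit error in place
    return c

code = hamming74_encode([1, 0, 1, 1])
code[4] ^= 1  # one bit flipped by the channel
print(hamming74_correct(code) == hamming74_encode([1, 0, 1, 1]))  # True
```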

Slide 81

Slide 81 text

Convolutional Codes [shift-register diagram: input stream 011100101010010100010 feeds XOR (+) taps producing parity streams p0 and p1]

Slide 82

Slide 82 text

Convolutional Codes [shift-register diagram: input stream 011100101010010100010 feeds XOR (+) taps producing parity streams p0 and p1]
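
A minimal sketch of the idea, with assumed generator taps (the common (7, 5)-octal pair; the slide's actual taps aren't recoverable): each input bit shifts into a small register, and two XOR taps over the register emit the parity streams p0 and p1, giving a rate-1/2 code.

```python
def convolve(bits, taps=(0b111, 0b101)):
    """Rate-1/2 convolutional encoder, constraint length 3.

    Each tap pattern selects register positions; the parity of the selected
    bits is one output, so every input bit yields a bit on p0 and on p1.
    """
    state, p0, p1 = 0, [], []
    for b in bits:
        state = ((state << 1) | b) & 0b111              # shift the new bit in
        p0.append(bin(state & taps[0]).count("1") % 2)  # XOR of tapped bits
        p1.append(bin(state & taps[1]).count("1") % 2)
    return p0, p1

stream = [int(c) for c in "011100101010010100010"]  # the input stream on the slide
print(convolve(stream))
```

Decoding such a stream by maximum likelihood is exactly what the Viterbi algorithm from the pipeline slides does.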

Slide 83

Slide 83 text

No content

Slide 84

Slide 84 text

“Just use more power; more bandwidth”

Slide 85

Slide 85 text

No content

Slide 86

Slide 86 text

1958: Pioneer IX

Slide 87

Slide 87 text

1962: Mariner 2

Slide 88

Slide 88 text

1969: Mariner VI

Slide 89

Slide 89 text

Images from Mars 1964: Mariner IV, using no encoding; 1969: Mariner VI, using Reed-Muller encoding

Slide 90

Slide 90 text

The Grand Tour (Courtesy NASA/JPL-Caltech)

Slide 91

Slide 91 text

Voyager 1

Slide 92

Slide 92 text

No content

Slide 93

Slide 93 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 94

Slide 94 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 95

Slide 95 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 96

Slide 96 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 97

Slide 97 text

Review! data → Compress (Huffman, LZW, etc.) → Channel Coding (Viterbi, Turbo codes, LDPC, etc.) → Modulation! → NOISY CHANNELS → Demodulation! → Decoding → Decompression → estimated data

Slide 98

Slide 98 text

No content

Slide 99

Slide 99 text

Weird Shit/Current Areas of Research

Slide 100

Slide 100 text

is information physical? can you measure the energy of a bit?

Slide 101

Slide 101 text

black holes and information

Slide 102

Slide 102 text

reversible logic