Slide 1

One thing to rule it all: entropy
(or: the beauty of not knowing)
Piotr Migdał
4th Offtopicarium, 5 Jan 2014, Słomczyn near Warsaw

Slide 2

No content

Slide 3

No content

Slide 4

About
• Thermodynamics
• Information theory
• Statistics
• ...

Slide 5

Why do things happen?
• Because they have to
• Because everything else is extremely unlikely

Slide 6

Entropy ~ logarithm of the number of possibilities.
The higher the entropy, the more likely it is to happen.
In large systems unlikely things become impossible
(throwing 3 tails in a row vs throwing 1,000,000 tails in a row).

Slide 7

http://ensemble.va.com.au/Treister/HEXEN2/HEXEN_2.html

Slide 8

Shannon entropy (the general concept for ANY probability distribution):
S = -∑_x p(x) log p(x) = ∑_x p(x) log(1/p(x))
Examples: log2(1) = 0, log2(2) = 1, log2(4) = 2, log2(8) = 3
Units: log2 gives bits, ln = log_e gives nats, log10 gives bans

Slide 9

Examples
• A coin: (1/2) log2(2) + (1/2) log2(2) = 1
• A die: 6 × (1/6) log2(6) = log2(6) ≈ 2.6
• Dying one day: 0 log2(0) + 1 log2(1) = 0
For n equally probable events: S = log(n)
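The coin and die examples above can be checked with a small Python sketch (not part of the original slides; the function name is mine):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits; the 0 * log(0) term is taken as 0."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([1/2, 1/2]))   # fair coin: 1.0 bit
print(shannon_entropy([1/6] * 6))    # fair die: log2(6), about 2.585 bits
print(shannon_entropy([0.0, 1.0]))   # certain event ("dying one day"): 0.0 bits
```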

Slide 10

20 questions
• "Is it alive?": (1/2) log2(2) + (1/2) log2(2) = 1 bit
• "Is it a ring?": 0.999 log2(1/0.999) + 0.001 log2(1/0.001) ≈ 0.01 bit
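The 20-questions numbers above follow from the entropy of a single yes/no question; a quick check in Python (my own sketch, not from the talk):

```python
from math import log2

def binary_entropy(p):
    """Entropy in bits of a yes/no question, given the probability p of 'yes'."""
    return sum(q * log2(1 / q) for q in (p, 1 - p) if q > 0)

print(binary_entropy(0.5))    # "Is it alive?": 1.0 bit
print(binary_entropy(0.999))  # "Is it a ring?": about 0.011 bits
```

A 50/50 question is maximally informative; a question that is almost always answered the same way tells you almost nothing.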

Slide 11

Sequence compression
01001010001101100101111...  with p(0) = p(1) = 1/2: 1 bit per char
ACTAGATACTG...  with p(A) = p(C) = p(G) = p(T) = 1/4: 2 bits per char
(A - 00, C - 01, G - 10, T - 11 gives 00011100100011011110...)
Entropy ~ effective number of characters

Slide 12

Sequence compression
zzzhzzhzzhhzzhzzzhzzzzhzhzz  with p(z) = 3/4, p(h) = 1/4
Code: zz - 0, zh - 10, h - 11: 0.81 bit per char
010011011110110100010100
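The 0.81 bit per char figure is the entropy limit; a back-of-envelope check in Python (my own sketch, assuming i.i.d. characters) compares it with the expected rate of the slide's prefix code:

```python
from math import log2

# Entropy limit for p(z) = 3/4, p(h) = 1/4.
p_z, p_h = 3/4, 1/4
entropy = p_z * log2(1 / p_z) + p_h * log2(1 / p_h)

# Expected rate of the prefix code (zz -> 0, zh -> 10, h -> 11):
# average bits per token divided by average chars per token.
tokens = [(p_h,       1, 2),   # "h":  1 char, 2 bits
          (p_z * p_h, 2, 2),   # "zh": 2 chars, 2 bits
          (p_z * p_z, 2, 1)]   # "zz": 2 chars, 1 bit
rate = (sum(p * bits for p, _, bits in tokens)
        / sum(p * chars for p, chars, _ in tokens))

print(round(entropy, 3), round(rate, 3))  # about 0.811 vs about 0.821
```

The simple code gets within about 1% of the entropy bound; no lossless code can do better than the bound.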

Slide 13

Heat
dS = dQ / T

Slide 14

Phase space and phase transitions
Axes: position and momentum ("velocity")
Ice vs water at low temperature

Slide 15

Phase space and phase transitions
Ice vs water at high temperature
Freezing water by looking at it (when you know the actual state, you can extract more energy)

Slide 16

Other uses
• Economic diversity (inequality)
• Population diversity
Collision entropy: S2 = -log(p1^2 + ... + pn^2), where p_i = n_i/n
Purity: p1^2 + ... + pn^2 (related to the Gini coefficient)
Effective number: 1/(p1^2 + ... + pn^2)
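Collision entropy and the effective number can be computed in a few lines of Python (a sketch with a made-up example distribution, not from the slides):

```python
from math import log

def collision_entropy(probs):
    """S2 = -log(p1^2 + ... + pn^2); 1/purity is the effective
    number of equally common types."""
    purity = sum(p * p for p in probs)
    return -log(purity), 1 / purity

# Hypothetical population: one type dominates.
s2, effective = collision_entropy([0.7, 0.1, 0.1, 0.1])
print(s2, effective)  # four types behave like roughly 1.9 equally common ones
```

The effective number is handy for diversity: a skewed population "counts" as fewer types than its raw count suggests.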

Slide 17

gzip + poetry = awesome http://jvns.ca/blog/2013/10/24/day-16-gzip-plus-poetry-equals-awesome/
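In the spirit of the linked post, the link between compression and entropy can be demonstrated with Python's zlib (an illustration of the idea, not the post's code):

```python
import os
import zlib

# Compressed size is a rough proxy for entropy: predictable text
# squeezes down, random bytes do not.
repetitive = b"entropy " * 100   # 800 bytes, very predictable
random_ish = os.urandom(800)     # 800 bytes, maximally unpredictable

print(len(zlib.compress(repetitive)))  # small (tens of bytes)
print(len(zlib.compress(random_ish)))  # about 800 bytes, no savings
```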

Slide 18

Mutual information
Alice and Bob each answer yes/no with probability 1/2; when their answers always agree, mutual information = entropy (1 bit)

Slide 19

Mutual information
Alice answers yes/no with probability 1/2; Bob agrees with probability p and disagrees with probability 1 - p
As p varies, mutual information goes from 1 bit (p = 0) to 0 bits (p = 1/2) to 1 bit (p = 1)
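The 1 bit / 0 bits / 1 bit behaviour follows from I(A;B) = H(B) - H(B|A); a small Python check of this setup (my own sketch, not from the slides):

```python
from math import log2

def h(probs):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

def mutual_information(p):
    """Alice answers yes/no uniformly; Bob agrees with probability p.
    By symmetry Bob's marginal is also uniform, so H(B) = 1 bit,
    and H(B|A) is the entropy of the agree/disagree coin."""
    return h([1/2, 1/2]) - h([p, 1 - p])

for p in (1.0, 0.5, 0.0):
    print(p, mutual_information(p))  # 1 bit, 0 bits, 1 bit
```

Perfect agreement and perfect disagreement are equally informative; a fair agree/disagree coin carries no information at all.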

Slide 20

mutual information: bounded by the smaller entropy

Slide 21

Off-topic: http://blog.xkcd.com/2010/05/03/color-survey-results/

Slide 22

Good to know
• Cleaning your room DOES NOT decrease its entropy (macroscopic order is a negligible contribution, but usually you heat the room a bit)
• Every known sequence has entropy 0
• Entropy is a unit of information, but it is typically used to measure lack of information
01001010001101100101111...

Slide 23

Homework
What is the entropy:
• of a shuffled card deck?
• of word usage?
• of your bookshelf?
And what is the mutual information between countries and religions?

Slide 24

Thank you! [email protected]

Slide 25

Jokes & comments
• Natural logarithm and alternative medicine
• Sauron and the wedding ring
• Beauty of not knowing