
One thing to rule it all - on entropy

Piotr Migdał
January 05, 2014


Entropy as in probability, compression, physics, economics and other fields.
From the 4th Offtopicarium, http://offtopicarium.wikidot.com/, more links on the program page: https://hackpad.com/Program-of-the-4th-Offtopicarium-vU9mQ2TmQjX.


Transcript

  1. One thing to rule it all: entropy. Piotr Migdał, 4th Offtopicarium, 5 Jan 2014, Słomczyn near Warsaw (or: the beauty of not knowing)
  2. Why do things happen? • Because they have to • Because everything else is extremely unlikely
  3. Entropy ~ logarithm of the number of possibilities. The higher the entropy, the more likely it is to happen. In large systems, unlikely things become impossible (throwing 3 tails in a row vs throwing 1,000,000 tails in a row).
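
A quick numeric illustration of this point (my own sketch in Python, not part of the deck): the probability of k tails in a row is 2^(-k), and the entropy of k fair coin throws is the logarithm of the number of possible outcomes.

```python
import math

# Probability of throwing k tails in a row with a fair coin is 2**(-k):
# already small for k = 3, effectively impossible for k = 1,000,000.
for k in [3, 1_000_000]:
    print(f"{k} tails in a row: probability = 2^(-{k})")

# Entropy ~ log of the number of possibilities:
# k coin throws have 2**k equally likely outcomes, i.e. log2(2**k) = k bits.
k = 3
print(math.log2(2**k), "bits for", k, "fair coin throws")
```
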
  4. Shannon entropy: S = -\sum_x p(x) \log p(x) = \sum_x p(x) \log \frac{1}{p(x)} (the general concept for ANY probability distribution). With \log_2 it is measured in bits, with \ln = \log_e in nats, with \log_{10} in bans. For instance \log_2 1 = 0, \log_2 2 = 1, \log_2 4 = 2, \log_2 8 = 3, so outcomes of probability 1/2, 1/4 and 1/8 carry 1, 2 and 3 bits respectively.
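
A minimal sketch of this definition in Python (my own illustration, not from the deck; the function name is mine). The base of the logarithm picks the unit: 2 gives bits, e gives nats, 10 gives bans.

```python
import math

def shannon_entropy(probs, base=2):
    """S = -sum_x p(x) log p(x), skipping zero-probability terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))           # 1.0 bit
print(shannon_entropy([0.5, 0.5], math.e))   # ~0.693 nats
print(shannon_entropy([0.5, 0.5], 10))       # ~0.301 bans
```
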
  5. Examples: • A coin: \frac{1}{2}\log_2 2 + \frac{1}{2}\log_2 2 = 1 bit • A die: 6 \cdot \frac{1}{6}\log_2 6 \approx 2.6 bits • Dying one day: 0 \cdot \log_2(0) + 1 \cdot \log_2(1) = 0 bits (with the convention that the 0 \cdot \log 0 term vanishes). For n equally probable events: S = \log(n).
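
The same examples checked numerically (again my own sketch; the helper repeats the definition above so the snippet stands alone):

```python
import math

def shannon_entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([1/2, 1/2]))   # a coin: 1.0 bit
print(shannon_entropy([1/6] * 6))    # a die: ~2.585 bits
print(shannon_entropy([0.0, 1.0]))   # a certain event: 0 bits
n = 6
print(math.log2(n))                  # S = log2(n) for n equally probable events
```
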
  6. 20 questions. "Is it alive?": \frac{1}{2}\log_2 2 + \frac{1}{2}\log_2 2 = 1 bit. "Is it a ring?": 0.999 \log_2 \frac{1}{0.999} + 0.001 \log_2 \frac{1}{0.001} \approx 0.01 bit.
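
In code, this is why a balanced question is worth asking and an overly specific one is nearly worthless (my own check, using the same entropy function as above):

```python
import math

def shannon_entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# "Is it alive?" -- splits the possibilities roughly in half.
print(shannon_entropy([0.5, 0.5]))       # 1.0 bit

# "Is it a ring?" -- the answer is almost always "no", so it tells us little.
print(shannon_entropy([0.999, 0.001]))   # ~0.011 bits
```
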
  7. Sequence compression. A random bit string 01001010001101100101111... with p(0) = p(1) = 1/2 takes 1 bit per character. A sequence ACTAGATACTG... with p(A) = p(C) = p(G) = p(T) = 1/4 takes 2 bits per character with the code A - 00, C - 01, G - 10, T - 11, e.g. 00011100100011011110... (the entropy gives the effective number of characters).
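
A sketch of the fixed-length code above (the dictionary and function are mine): four equally likely letters need 2 bits each, matching the entropy of a uniform distribution over {A, C, G, T}.

```python
import math

CODE = {"A": "00", "C": "01", "G": "10", "T": "11"}

def encode(seq):
    return "".join(CODE[ch] for ch in seq)

seq = "ACTAGATACTG"
bits = encode(seq)
print(bits)
print(len(bits) / len(seq), "bits per character")                        # 2.0
print(-4 * (1 / 4) * math.log2(1 / 4), "bits of entropy per character")  # 2.0
```
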
  8. Sequence compression. zzzhzzhzzhhzzhzzzhzzzzhzhzz with p(z) = 3/4, p(h) = 1/4. Code: zz - 0, zh - 10, h - 11, giving about 0.81 bit per character: 010011011110110100010100
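
A sketch of the block code on this slide (the implementation is mine; I parse the string greedily into the blocks zz, zh, h). The entropy of the source, about 0.81 bit per character, is the best rate any code can reach on average.

```python
import math

def encode(seq):
    """Greedy parse into blocks: zz -> 0, zh -> 10, h -> 11 (a prefix code)."""
    out, i = [], 0
    while i < len(seq):
        if seq[i] == "h":
            out.append("11")
            i += 1
        elif seq[i:i + 2] == "zz":
            out.append("0")
            i += 2
        elif seq[i:i + 2] == "zh":
            out.append("10")
            i += 2
        else:
            raise ValueError("a lone trailing 'z' cannot be encoded by this block code")
    return "".join(out)

seq = "zzzhzzhzzhhzzhzzzhzzzzhzhzz"
bits = encode(seq)
print(bits)
print(round(len(bits) / len(seq), 2), "bits per character on this sample")

# Entropy of the source with p(z) = 3/4, p(h) = 1/4:
p = [3 / 4, 1 / 4]
print(round(-sum(q * math.log2(q) for q in p), 2), "bits per character (entropy)")
```
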
  9. Phase space and phase transitions: position vs momentum ("velocity"); ice vs water at high temperature. Freezing water by looking at it: when you know the actual state, you can extract more energy.
  10. Other uses: • Economic diversity (inequality) • Population diversity. Collision entropy: S_2 = -\log(p_1^2 + \ldots + p_n^2) with p_i = n_i / n; the sum p_1^2 + \ldots + p_n^2 is the purity (related to the Gini coefficient), and 1/(p_1^2 + \ldots + p_n^2) is the effective number of categories.
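
A sketch of these diversity measures in Python (function and variable names are mine; the counts are hypothetical group sizes n_i, turned into p_i = n_i / n as on the slide):

```python
import math

def collision_entropy(probs, base=2):
    """S_2 = -log(p_1^2 + ... + p_n^2), the Renyi entropy of order 2."""
    return -math.log(sum(p * p for p in probs), base)

counts = [50, 30, 20]               # hypothetical group sizes n_i
n = sum(counts)
probs = [c / n for c in counts]     # p_i = n_i / n

purity = sum(p * p for p in probs)  # p_1^2 + ... + p_n^2
print(purity)                       # ~0.38
print(collision_entropy(probs))     # ~1.40 bits
print(1 / purity)                   # ~2.63 effective categories
```
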
  11. Mutual information. Alice and Bob each answer yes/no; the 2×2 joint table has entries built from p and 1 - p, with both marginals equal to 1/2. As p goes from 1 to 1/2 to 0, the mutual information goes from 1 bit to 0 bits and back to 1 bit.
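
A sketch of this picture in code (my own parametrization: I assume a symmetric 2×2 joint table with entries p/2 and (1 - p)/2, so both marginals are 1/2):

```python
import math

def entropy(probs):
    return -sum(q * math.log2(q) for q in probs if q > 0)

def mutual_information(p):
    """I(Alice; Bob) for the joint table [[p/2, (1-p)/2], [(1-p)/2, p/2]]."""
    joint = [p / 2, (1 - p) / 2, (1 - p) / 2, p / 2]
    # Both marginals are (1/2, 1/2), so I = H(Alice) + H(Bob) - H(Alice, Bob).
    return entropy([0.5, 0.5]) + entropy([0.5, 0.5]) - entropy(joint)

for p in [1.0, 0.5, 0.0]:
    print(p, round(mutual_information(p), 3), "bits")   # 1.0, 0.0, 1.0 bits
```
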
  12. Good to know: • Cleaning your room DOES NOT decrease its entropy (macroscopic order is a negligible contribution; if anything, you usually heat the room up a bit) • Every known sequence has entropy 0 • Entropy is measured in units of information, but it typically quantifies a lack of information: 01001010001101100101111...
  13. Homework • What is the entropy of: • a shuffled card deck? • word usage? • your bookshelf? • What is the mutual information between countries and religions?
  14. Jokes & comments • Natural logarithm and alternative medicine • Sauron and the wedding ring • Beauty of not knowing