
Statistical Rethinking 2023 - Lecture 08

Richard McElreath

January 25, 2023

Transcript

  1. Statistical Rethinking 8. Markov chain Monte Carlo 2023

  2. None
  3. [Figure: scatterplots of divorce rate against marriage rate and against median age of marriage, labeled by state abbreviation, alongside a DAG with nodes M (marriage), D (divorce), and A (age at marriage) and a "?" marking the causal question]
  4. M D A M* D* A* Marriage observed Age at

    marriage observed Divorce observed
  5. M D A M* D* A* Marriage observed Age at

    marriage observed Divorce observed P Population
  6. K Knowledge Real, latent modeling problems

  7. K S Score Knowledge Real, latent modeling problems

  8. K S Score Q Knowledge Test/Instrument Real, latent modeling problems

  9. K S Score Q Knowledge Test/Instrument X Person feature Real,

    latent modeling problems
  10. Q Y Evaluation R Quality Referee X Feature Real, latent

    modeling problems
  11. Social networks YAB RAB Relationship A→B Behavior A→B

  12. Social networks YAB RAB RBA Relationship B→A Relationship A→B Behavior

    A→B
  13. Social networks YAB XA XB RAB RBA Relationship B→A Relationship

    A→B Features of B Features of A
  14. Social networks YAB XA XB RAB RBA Relationship B→A Relationship

    A→B Features of B Features of A
  15. Problems & solutions Real research problems: Many unknowns Nested relationships

    Bounded outcomes Difficult calculations K S Q X YAB XA XB RAB RBA
  16. aNAlYzE tHe DatA

  17. Computing the posterior 1. Analytical approach (often impossible) 2.

    Grid approximation (very intensive) 3. Quadratic approximation (limited) 4. Markov chain Monte Carlo (intensive)
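    To make option 2 concrete: a minimal grid-approximation sketch in R, using the globe-tossing example from earlier in the course (6 water observations in 9 tosses; the grid size here is arbitrary):

        p_grid <- seq( from=0 , to=1 , length.out=1000 )  # candidate parameter values
        prior <- rep( 1 , 1000 )                          # flat prior
        likelihood <- dbinom( 6 , size=9 , prob=p_grid )  # likelihood at each grid point
        posterior <- likelihood * prior
        posterior <- posterior / sum(posterior)           # normalize so it sums to 1
        plot( p_grid , posterior , type="l" )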
  18. King Markov

  19. The Metropolis Archipelago

  20. Contract: King Markov must visit each island in proportion to

    its population size Here’s how he does it...
  21. (1) Flip a coin to choose island on left or

    right. Call it the “proposal” island. 1 2 1 2
  22. 1 2 3 4 5 6 7 p5 proposal (1)

    Flip a coin to choose island on left or right. Call it the “proposal” island. (2) Find population of proposal island.
  23. p5 (1) Flip a coin to choose island on left

    or right. Call it the “proposal” island. p4 1 2 3 4 5 6 7 proposal (3) Find population of current island. (2) Find population of proposal island.
  24. (2) Find population of proposal island. (1) Flip a coin

    to choose island on left or right. Call it the “proposal” island. p5 p4 1 2 3 4 5 6 7 proposal (3) Find population of current island. (4) Move to proposal, with probability = p5/p4
  25. (2) Find population of proposal island. (1) Flip a coin

    to choose island on left or right. Call it the “proposal” island. (3) Find population of current island. 1 2 3 4 5 6 7 (4) Move to proposal, with probability = p5/p4 (5) Repeat from (1)
  26. (2) Find population of proposal island. (1) Flip a coin

    to choose island on left or right. Call it the “proposal” island. (3) Find population of current island. (4) Move to proposal, with probability = p5/p4 1 2 3 4 5 6 7 (5) Repeat from (1) This procedure ensures visiting each island in proportion to its population, in the long run.
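    A minimal R sketch of King Markov's procedure. It follows the simulation in the book's MCMC chapter, assuming for simplicity ten islands arranged in a ring with population proportional to island number (the slides draw seven islands in a row, so the layout here is illustrative):

        num_weeks <- 1e5
        positions <- rep( 0 , num_weeks )
        current <- 10
        for ( i in 1:num_weeks ) {
            positions[i] <- current                           # record the current island
            proposal <- current + sample( c(-1,1) , size=1 )  # (1) coin flip: left or right
            if ( proposal < 1 ) proposal <- 10                # wrap around the ring
            if ( proposal > 10 ) proposal <- 1
            prob_move <- proposal/current                     # (2)-(4) population ratio p_proposal/p_current
            current <- ifelse( runif(1) < prob_move , proposal , current )
        }
        plot( table(positions) )  # long-run visit counts are proportional to population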
  27. None
  28. None
  29. Markov chain Monte Carlo Usual use: Draw samples from a

    posterior distribution “Islands”: parameter values “Population size”: posterior probability Visit each parameter value in proportion to its posterior probability Any number of dimensions (parameters)
  30. “Markov chain Monte Carlo” Chain: Sequence of draws from distribution

    Markov chain: History doesn’t matter, just where you are now Monte Carlo: Random simulation
  31. “Markov chain Monte Carlo” Metropolis algorithm: Simple version of Markov

    chain Monte Carlo (MCMC) Easy to write, very general, often inefficient
  32. Metropolis, Rosenbluth, Rosenbluth, Teller and Teller (1953) JOURNAL OF CHEMICAL

    PHYSICS VOLUME 21, NUMBER 6 JUNE, 1953 Equation of State Calculations by Fast Computing Machines NICHOLAS METROPOLIS, ARIANNA W. ROSENBLUTH, MARSHALL N. ROSENBLUTH, AND AUGUSTA H. TELLER, Los Alamos Scientific Laboratory, Los Alamos, New Mexico AND EDWARD TELLER, Department of Physics, University of Chicago, Chicago, Illinois (Received March 6, 1953) A general method, suitable for fast computing machines, for investigating such properties as equations of state for substances consisting of interacting individual molecules is described. The method consists of a modified Monte Carlo integration over configuration space. Results for the two-dimensional rigid-sphere system have been obtained on the Los Alamos MANIAC and are presented here. These results are compared to the free volume equation of state and to a four-term virial coefficient expansion.
  33. Metropolis, Rosenbluth, Rosenbluth, Teller and Teller (1953): same title page and abstract as the previous slide.
  34. Arianna Rosenbluth (1927–2020)

  35. Arianna Rosenbluth (1927–2020)

  36. None
  37. Mathematical Analyzer, Numerical Integrator, and Computer MANIAC: 1000 pounds 5

    kilobytes of memory 70k multiplications/sec Your laptop: 4–7 pounds 8+ million kilobytes memory Billions of multiplications/sec
  38. MCMC is diverse Metropolis has yielded to newer, more efficient

    algorithms Many innovations in the last decades Best methods use gradients We’ll use Hamiltonian Monte Carlo
  39. Basic Rosenbluth (aka Metropolis) algorithm

  40. Basic Rosenbluth (aka Metropolis) algorithm large step size small step

    size
  41. [Figure: probability as a function of parameter value, a bell-shaped density over roughly -3 to 3]
  42. [Figure: the same density shown both as probability and as negative log probability over roughly -3 to 3]
  43. [Figure: two diagrams labeled with x, y, z]

  44. None
  45. Low probability High probability

  46. Hamiltonian Monte Carlo

  47. Hamiltonian Monte Carlo

  48. None
  49. Overthinking: Hamiltonian Monte Carlo in the raw (book pages 276–278). The HMC algorithm needs five things to go: (1) a function U that returns the negative log-probability of the data at the current position (parameter values), (2) a function grad_U that returns the gradient of the negative log-probability at the current position, (3) a step size epsilon, (4) a count of leapfrog steps L, and (5) a starting position current_q. The position is a vector of parameter values, and the gradient must return a vector of the same length. For the 2-D Gaussian example, the U function expresses the log posterior, sum_i log p(y_i | mu_y, 1) + sum_i log p(x_i | mu_x, 1) + log p(mu_y | a, b) + log p(mu_x | k, d), as four calls to dnorm:

        # U needs to return neg-log-probability
        U <- function( q , a=0 , b=1 , k=0 , d=1 ) {
            muy <- q[1]
            mux <- q[2]
            U <- sum( dnorm(y,muy,1,log=TRUE) ) + sum( dnorm(x,mux,1,log=TRUE) ) +
                dnorm(muy,a,b,log=TRUE) + dnorm(mux,k,d,log=TRUE)
            return( -U )
        }

    The gradient function requires two partial derivatives. Gaussian derivatives are clean: the derivative of the logarithm of any univariate Gaussian with mean a and standard deviation b, with respect to a, is (y - a)/b^2. Since the derivative of a sum is a sum of derivatives, this is all that is needed to write the gradients term by term. Pages 276–278.
  50. Calculus is a superpower Hamiltonian Monte Carlo needs gradients How

    does it get them? Write them yourself or… Auto-diff: Automatic differentiation Symbolic derivatives of your model code Used in many machine learning approaches; “Backpropagation” is special case
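    For the 2-D Gaussian example in the Overthinking excerpt above, the gradient can also be written by hand. A sketch of a grad_U companion to the U function shown there, following the Gaussian-derivative identity from the book (the data vectors y and x and the prior parameters a, b, k, d are the same ones U assumes):

        # gradient of U (the negative log-probability) with respect to (muy, mux)
        grad_U <- function( q , a=0 , b=1 , k=0 , d=1 ) {
            muy <- q[1]
            mux <- q[2]
            G1 <- sum( y - muy ) + ( a - muy )/b^2  # d(log prob)/d muy
            G2 <- sum( x - mux ) + ( k - mux )/d^2  # d(log prob)/d mux
            return( c( -G1 , -G2 ) )                # negate, because U is the NEGATIVE log-probability
        }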
  51. Stanisław Ulam (1909–1984) mc-stan.org

  52. Stanisław Ulam and his daughter Claire with MANIAC

  53. PAUSE

  54. Example: The Judgment at Princeton

  55. library(rethinking) data(Wines2012) Example: The Judgment at Princeton

  56. library(rethinking) data(Wines2012) Example: The Judgment at Princeton 20 wines (10

    French, 10 NJ) 9 French & American judges 180 scores
  57. Q S Score J Wine quality Judge

  58. Q S Score J Wine quality Judge X Wine origin

  59. Q S Score J Wine quality Judge X Wine origin

    Z Judge origin
  60. Q S Score J Wine quality Judge X Wine origin

    Z Judge origin
  61. Q S Score J Wine quality Judge X Wine origin

    Z Judge origin
  62. Q S Score J Wine quality Judge X Wine origin

    Z Judge origin Estimand: Association between wine quality and wine origin. Stratify by judge for efficiency.
  63. library(rethinking)
    data(Wines2012)
    d <- Wines2012
    dat <- list(
        S=standardize(d$score),
        J=as.numeric(d$judge),
        W=as.numeric(d$wine),
        X=ifelse(d$wine.amer==1,1,2),
        Z=ifelse(d$judge.amer==1,1,2) )
    mQ <- ulam(
        alist(
            S ~ dnorm(mu,sigma),
            mu <- Q[W],
            Q[W] ~ dnorm(0,1),
            sigma ~ dexp(1)
        ) , data=dat , chains=4 , cores=4 )
    precis(mQ,2)

    Model (shown alongside the DAG with nodes Q, S, J, X, Z):
    S_i ~ Normal(mu_i, sigma)
    mu_i = Q_{W[i]}
    Q_j ~ Normal(0, 1)
    sigma ~ Exponential(1)
  64. The same model code, with the precis(mQ,2) output:

             mean   sd  5.5% 94.5% n_eff Rhat4
    Q[1]     0.14 0.30 -0.34  0.64  2962     1
    Q[2]     0.11 0.32 -0.40  0.61  3033     1
    Q[3]     0.27 0.31 -0.21  0.75  2608     1
    Q[4]     0.55 0.33  0.04  1.09  2598     1
    Q[5]    -0.13 0.32 -0.63  0.37  2726     1
    Q[6]    -0.37 0.32 -0.88  0.12  2710     1
    Q[7]     0.29 0.32 -0.22  0.81  2679     1
    Q[8]     0.27 0.32 -0.24  0.79  3248     1
    Q[9]     0.08 0.30 -0.40  0.56  3410     1
    Q[10]    0.12 0.32 -0.40  0.63  2697     1
    Q[11]   -0.02 0.31 -0.52  0.50  2740     1
    Q[12]   -0.03 0.32 -0.53  0.48  2809     1
    Q[13]   -0.11 0.32 -0.62  0.39  3225     1
    Q[14]    0.01 0.33 -0.51  0.52  2690     1
    Q[15]   -0.22 0.31 -0.71  0.28  2754     1
    Q[16]   -0.21 0.32 -0.72  0.30  2120     1
    Q[17]   -0.14 0.31 -0.64  0.37  3831     1
    Q[18]   -0.86 0.32 -1.37 -0.35  3093     1
    Q[19]   -0.16 0.30 -0.65  0.31  2783     1
    Q[20]    0.38 0.32 -0.12  0.88  2873     1
    sigma    1.00 0.06  0.91  1.09  2759     1
  65. Drawing the Markov Owl Complex machinery, but lots of diagnostics

    (1) Trace plots (2) Trace rank plots (3) R-hat convergence measure (4) Number of effective samples (5) Divergent transitions
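    For the wine model mQ fit above, the first four diagnostics can be produced with rethinking's helpers (a usage sketch; divergent-transition warnings are reported by Stan during sampling):

        traceplot(mQ)   # (1) trace plots, one panel per parameter
        trankplot(mQ)   # (2) trace rank (trank) plots
        precis(mQ, 2)   # (3) Rhat and (4) n_eff columns in the summary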
  66. Trace plots. [Figure: a grid of trace plots for Q[1]–Q[20], 1000 samples each, with n_eff values of roughly 2100–3800]
  67. [Figure: zoomed detail of several of the trace plots]
  68. [Figure: trace plot detail with the warmup and sampling phases labeled]
  69. Need more than 1 chain to check convergence. Convergence: each chain explores the right distribution and every chain explores the same distribution. (Shown alongside the mQ code from slide 63, fit with chains=4, cores=4.)
  70. [Figure: the full grid of trace plots for Q[1]–Q[20] again]
  71. [Figure: the same grid of trace plots]
  72. [Figure: the same grid of trace plots]
  73. [Figure: trace plots for Q[1]–Q[20] from a poorly behaved fit: parameter values wandering over roughly -300 to 300, with n_eff mostly between about 90 and 230]
  74. [Figure: zoomed detail of several of the poorly behaved trace plots]
  75. Trace rank (Trank) plots. Rank orders of samples: no chain should tend to be below/above the others. [Figure: trank plots for Q[1], Q[2], Q[5], and Q[6]]
  76. [Figure: trank plots for Q[1]–Q[20] from the mQ fit, n_eff roughly 2100–3800]
  77. [Figure: trank plots for Q[1]–Q[20] from the poorly behaved fit, n_eff mostly around 90–230]
  78. [Figure: zoomed trank plots for Q[6], Q[7], Q[10], and Q[11] from the poorly behaved fit]
  79. R-hat. When chains converge: (1) start and end of each chain explore the same region; (2) independent chains explore the same region. R-hat is a ratio of variances: as total variance shrinks to average variance within chains, R-hat approaches 1. NO GUARANTEES; NOT A TEST. [Figure: between-chain and within-chain sample variance converging over 1000 samples; precis rows for Q[1]–Q[3] with Rhat4 = 1]
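    A toy sketch of the variance-ratio idea behind R-hat, for a single parameter (modern estimators such as Rhat4 also split chains and rank-normalize, which this version omits):

        # chains: matrix of posterior draws, one column per chain
        rhat_basic <- function( chains ) {
            n <- nrow(chains)                        # draws per chain
            B <- n * var( colMeans(chains) )         # between-chain variance
            W <- mean( apply( chains , 2 , var ) )   # average within-chain variance
            var_hat <- (n-1)/n * W + B/n             # pooled estimate of total variance
            sqrt( var_hat / W )                      # approaches 1 as chains converge
        }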
  80. n_eff: estimate of the number of effective samples. “How long would the chain be, if each sample was independent of the one before it?” When samples are autocorrelated, you have fewer effective samples. [Figure: autocorrelation function (ACF) of the posterior samples; precis rows for Q[1]–Q[3] with their n_eff values]
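    A rough sketch of the effective-sample-size idea for a single chain, using the autocorrelation function pictured on the slide (real estimators, like Stan's, combine chains and truncate the autocorrelation sum more carefully):

        ess_rough <- function( draws , max_lag=100 ) {
            rho <- acf( draws , lag.max=max_lag , plot=FALSE )$acf[-1]  # drop lag 0
            first_neg <- which( rho <= 0 )[1]                  # truncate at first non-positive lag
            if ( !is.na(first_neg) ) rho <- rho[ seq_len(first_neg - 1) ]
            length(draws) / ( 1 + 2*sum(rho) )                 # more autocorrelation, fewer effective samples
        }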
  81. mQO <- ulam(
        alist(
            S ~ dnorm(mu,sigma),
            mu <- Q[W] + O[X],
            Q[W] ~ dnorm(0,1),
            O[X] ~ dnorm(0,1),
            sigma ~ dexp(1)
        ) , data=dat , chains=4 , cores=4 )
    plot(precis(mQO,2))

    Model (DAG: Q, S, J, X, Z):
    S_i ~ Normal(mu_i, sigma)
    mu_i = Q_{W[i]} + O_{X[i]}
    Q_j ~ Normal(0, 1)
    O_j ~ Normal(0, 1)
    sigma ~ Exponential(1)
  82. Model as on the previous slide. [Figure: plot(precis(mQO,2)) output: posterior intervals for Q[1]–Q[20] (wines), O[1] and O[2] (origins), and sigma, spanning roughly -1.5 to 1.0]
  83. mQOJ <- ulam(
        alist(
            S ~ dnorm(mu,sigma),
            mu <- (Q[W] + O[X] - H[J])*D[J],
            Q[W] ~ dnorm(0,1),
            O[X] ~ dnorm(0,1),
            H[J] ~ dnorm(0,1),
            D[J] ~ dexp(1),
            sigma ~ dexp(1)
        ) , data=dat , chains=4 )
    plot(precis(mQOJ,2))

    Model (DAG: Q, S, J, X, Z):
    S_i ~ Normal(mu_i, sigma)
    mu_i = (Q_{W[i]} + O_{X[i]} - H_{J[i]}) * D_{J[i]}
    Q_j ~ Normal(0, 1)
    O_j ~ Normal(0, 1)
    H_j ~ Normal(0, 1)
    D_j ~ Exponential(1)
    sigma ~ Exponential(1)
  84. The mQOJ model equation annotated: mu_i = (Q_{W[i]} + O_{X[i]} - H_{J[i]}) * D_{J[i]} is the expected score.
  85. Annotation continued: Q_{W[i]} is wine quality.
  86. Annotation continued: O_{X[i]} is wine origin.
  87. Annotation continued: H_{J[i]} is judge harshness and D_{J[i]} is judge discrimination.
  88. [Figure: plot(precis(mQOJ,2)) output: posterior intervals for Q[1]–Q[20] (wines), O[1]–O[2] (origins), H[1]–H[9] (judge harshness), D[1]–D[9] (judge discrimination), and sigma, spanning roughly -3 to 2]
  89. [Figure: posterior density of the origin contrast, plotted over roughly -2 to 1 and centered near zero]
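    A sketch of how the origin contrast pictured here can be computed from the mQOJ posterior. The indexing follows the dat list above (X = 1 for American wines, X = 2 for French); extract.samples and dens are rethinking functions:

        post <- extract.samples(mQOJ)
        contrast_origin <- post$O[,1] - post$O[,2]   # American-origin effect minus French-origin effect
        dens( contrast_origin , xlab="origin contrast" )
        precis( data.frame(contrast_origin) )        # numerical summary of the contrast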
  90. The Folk Theorem of Statistical Computing “When you have computational

    problems, often there’s a problem with your model.” Andrew Gelman Spider-Man of Bayesian data analysis
  91. None
  92. Divergent transitions Divergent transition: A kind of rejected proposal Simulation

    diverges from true path Many DTs: poor exploration & possible bias Will discuss again in later lecture
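    One way to count divergent transitions after fitting, assuming an rstan backend (the @stanfit slot and this workflow are assumptions about the setup here, and cmdstanr backends report divergences differently):

        rstan::check_divergences( mQOJ@stanfit )    # prints how many post-warmup divergences occurred
        rstan::get_num_divergent( mQOJ@stanfit )    # returns the count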
  93. El Pueblo Unido Desktop MCMC has been a revolution in

    scientific computing Custom scientific modeling High-dimension Propagate measurement error Sir David Spiegelhalter
  94. Course Schedule

    Week 1   Bayesian inference                 Chapters 1, 2, 3
    Week 2   Linear models & Causal Inference   Chapter 4
    Week 3   Causes, Confounds & Colliders      Chapters 5 & 6
    Week 4   Overfitting / MCMC                 Chapters 7, 8, 9
    Week 5   Generalized Linear Models          Chapters 10, 11
    Week 6   Integers & Other Monsters          Chapters 11 & 12
    Week 7   Multilevel models I                Chapter 13
    Week 8   Multilevel models II               Chapter 14
    Week 9   Measurement & Missingness          Chapter 15
    Week 10  Generalized Linear Madness         Chapter 16
    https://github.com/rmcelreath/stat_rethinking_2023
  95. None