Dive into Probabilistic Programming


Sorami Hisamoto

September 03, 2016

Transcript

  1. Dive into Probabilistic Programming ("How about probabilistic programming?") @sorami, JuliaTokyo

  2. disclaimer: I'm just a novice; these slides probably contain errors. Please do have a look at the referred materials.

  3. Probabilistic Programming


  5. https://pycon.jp/2015/ja/schedule/presentation/38/


  7. https://pycon.jp/2015/ja/schedule/presentation/38/ ("probabilistic"? "probability-theoretic"?)

  8. Programming "probability distributions"… ✦ Probabilistic Programs: ✦ Sampling: draw values at random from distributions ✦ Conditioning: condition values of variables in a program via observation

    ‣ [Gordon+ 2014] "Probabilistic Programming", ICSE Future of Software Engineering ‣ What is probabilistic programming? - The PL Enthusiast ‣ The State of Probabilistic Programming « Some Thoughts on a Mysterious Universe
  9. Example: flipping coins ✦ Say we have a coin, with unknown bendiness ✦ We flip it a couple of times ✦ We then "guess" its bendiness ‣

    [Gordon+ 2014] "Probabilistic Programming", ICSE Future of Software Engineering ‣ What is probabilistic programming? - The PL Enthusiast ‣ “Probabilistic Models of Cognition” Chapter 3 : Conditioning
  10. makeCoin(0.9)  # "head" with probability 0.9
      makeCoin(0.1)  # "tail" with probability 0.9

      coin = makeCoin(0.5)  # distribution
      flip(coin)  # sampling => "head"
      flip(coin)  # => "head"
      flip(coin)  # => "tail"
      …
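The makeCoin / flip primitives above are pseudocode, not any real PPL's API. As an illustrative sketch, they can be modeled in plain Python with the standard random module:

```python
import random

def makeCoin(p_head):
    # A "distribution": a coin that comes up "head" with probability p_head.
    return p_head

def flip(coin):
    # Sampling: draw one value at random from the coin's distribution.
    return "head" if random.random() < coin else "tail"

random.seed(0)
coin = makeCoin(0.5)   # distribution
print(flip(coin))      # sampling => "head" or "tail"

# A bent coin lands "head" roughly 90% of the time:
bent = makeCoin(0.9)
flips = [flip(bent) for _ in range(10_000)]
print(flips.count("head") / len(flips))  # close to 0.9
```

Representing a distribution by a bare float only works for this one-parameter coin; real PPLs reify distributions as first-class objects so that conditioning and inference can inspect them.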
  15. coin2 = makeCoin(param)  # a coin with unknown parameter

      result1 = flip(coin2)
      observe(result1 == head)  # conditioning
      result2 = flip(coin2)
      observe(result2 == head)
      result3 = flip(coin2)
      observe(result3 == head)

      infer(param)  # => probably higher than 0.5
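observe and infer are likewise pseudocode. One minimal way to realize them is rejection sampling: repeatedly run the program with param drawn from a prior (assumed here to be Uniform(0, 1)), and keep only the runs where every observation holds. A rough Python sketch:

```python
import random

random.seed(1)

def flip(p):
    return "head" if random.random() < p else "tail"

posterior = []
for _ in range(20_000):
    param = random.random()                  # prior: Uniform(0, 1)
    results = [flip(param) for _ in range(3)]
    if all(r == "head" for r in results):    # observe(result_i == head)
        posterior.append(param)              # this run survives conditioning

# infer(param): summarize the surviving samples.
mean = sum(posterior) / len(posterior)
print(round(mean, 2))  # around 0.8; analytically the posterior is Beta(4, 1)
```

Rejection sampling is easy to state but wasteful: only the runs consistent with every observation survive, and the acceptance rate collapses as observations accumulate, which is one reason real systems use smarter inference.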
  20. Example: rating online game players ✦ Players have "(True) Skill" and "Performance" ✦ Performance is based on Skill, and it varies from game to game ✦ Game outcome => condition the Skill

     ‣ TrueSkill™ Ranking System - Microsoft Research (figure from this page) ‣ TrueSkill — trueskill 0.4.4 documentation ‣ [Gordon+ 2014] "Probabilistic Programming", ICSE Future of Software Engineering
  21. skillA = Normal(μ=100, σ=10)
      skillB = Normal(μ=100, σ=10)
      skillC = Normal(μ=100, σ=10)

      ### 1st game : A vs B
      # players have probabilistic performance
      perfA1 = Normal(skillA, 15)
      perfB1 = Normal(skillB, 15)
      # A won => condition the skills
      observe(perfA1 > perfB1)
  25. ### 2nd game : B vs C
      perfB2 = Normal(skillB, 15)
      perfC2 = Normal(skillC, 15)
      observe(perfB2 > perfC2)

      ### 3rd game : A vs C
      perfA3 = Normal(skillA, 15)
      perfC3 = Normal(skillC, 15)
      observe(perfA3 > perfC3)

      infer(skillA)  # => Normal(μ=105.7, σ=0.11)
      infer(skillB)  # => Normal(μ=100.0, σ=0.11)
      infer(skillC)  # => Normal(μ=94.3, σ=0.11)
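The same rejection-sampling idea extends to the skill-rating model: draw every skill and per-game performance from its prior, and keep only the joint draws consistent with all three observed outcomes. An illustrative Python sketch using the slide's priors (the printed posterior means will only roughly approximate the slide's numbers):

```python
import random

random.seed(2)

kept = []
for _ in range(100_000):
    skills = {p: random.gauss(100, 10) for p in "ABC"}
    perf = lambda p: random.gauss(skills[p], 15)  # a fresh performance per game
    # observe(perfA1 > perfB1); observe(perfB2 > perfC2); observe(perfA3 > perfC3)
    if perf("A") > perf("B") and perf("B") > perf("C") and perf("A") > perf("C"):
        kept.append(skills)

def posterior_mean(player):
    return sum(s[player] for s in kept) / len(kept)

# A beat B, B beat C, and A beat C, so the inferred skills order as A > B > C.
print({p: round(posterior_mean(p), 1) for p in "ABC"})
```

Each call to perf draws a new performance, matching the model's assumption that performance varies independently from game to game while the underlying skill is shared across games.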
  28. Similar example that you can play with in the browser: "Probabilistic Models of Cognition", Chapter: Conditioning, "Tug of War"


  30. How do we make that possible? ✦ Static Inference ✦ Message Passing ✦ Belief Propagation ✦ … ✦ Dynamic Inference ✦ Sampling ✦ MCMC ✦ …

     ‣ [Gordon+ 2014] "Probabilistic Programming", ICSE Future of Software Engineering ‣ The Design and Implementation of Probabilistic Programming Languages ‣ "Probabilistic Models of Cognition", Chapter 7: Algorithms for inference
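To make the "Dynamic Inference: Sampling, MCMC" bullet concrete, here is a minimal Metropolis-Hastings sampler for the earlier coin model (uniform prior on the bias, three observed heads). This is a generic textbook sketch, not the inference engine of any particular system:

```python
import random

random.seed(3)

def unnormalized_posterior(p):
    # Uniform(0, 1) prior times the likelihood of observing three heads.
    return p ** 3 if 0.0 < p < 1.0 else 0.0

# Metropolis-Hastings with a symmetric Gaussian random-walk proposal.
samples, p = [], 0.5
for _ in range(50_000):
    proposal = p + random.gauss(0, 0.1)
    accept_prob = unnormalized_posterior(proposal) / unnormalized_posterior(p)
    if random.random() < accept_prob:
        p = proposal
    samples.append(p)

burned = samples[5_000:]  # discard burn-in
print(round(sum(burned) / len(burned), 2))  # near 0.8, the Beta(4, 1) posterior mean
```

Unlike rejection sampling, MCMC only ever evaluates the unnormalized posterior, which is what lets a PPL's inference engine run the same machinery over arbitrary user programs.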
  31. Latent Dirichlet Allocation in various forms ‣ [Tang+ 2014] Understanding the Limiting Factors of

    Topic Modelling via Posterior Contraction Analysis // Speaker Deck ‣ example-models/lda.stan at master · stan-dev/example-models ‣ Latent Dirichlet Allocation - Forest
  32. Why should we care? ✦ We communicate about the models in natural languages, maths, plate notations, programs, … ✦ "unifying general purpose programming with probabilistic modeling" ✦ Machine Learning for Domain Experts: scikit-learn is cool, but you can't construct novel models with it / different use cases

     ‣ PROBABILISTIC-PROGRAMMING.org ‣ Why Probabilistic Programming Matters, Beau Cronin, March 2013, (Japanese translation) ‣ The State of Probabilistic Programming « Some Thoughts on a Mysterious Universe
  33. DARPA PPAML ✦ "The Probabilistic Programming for Advancing Machine Learning" program ✦ "Shorter: Reduce LOC by …x for ML applications" ✦ "Faster: Reduce development time by …x" ✦ "More Informative: Develop models that are …x more sophisticated" ✦ "With Less Expertise: Enable …x more programmers" ✦ Summer Schools

     ‣ Probabilistic Programming for Advancing Machine Learning (PPAML), Dr. Suresh Jagannathan ‣ PPAML Wiki ‣ PPAML Kickoff Overview Slides (pdf), November 2013
  34. The Probabilistic Programming Revolution (figure from PPAML Kickoff Overview Slides (pdf), November 2013)

      Traditional Programming: Application Code / Libraries / Programming Language / Compiler / Hardware
      Probabilistic Programming: Model / Model Libraries / Probabilistic Programming Language / Inference Engine / Hardware

      ✦ Models capture how the data was generated, using random variables to represent uncertainty
      ✦ Libraries contain common model components: Markov chains, deep belief networks, etc.
      ✦ The PPL provides probabilistic primitives & traditional PL constructs so users can express models, queries, and data
      ✦ The inference engine analyzes the probabilistic program and chooses appropriate solver(s) for the available hardware
      ✦ Hardware can include multi-core, GPU, cloud-based resources, GraphLab, UPSIDE/Analog Logic results, etc.

      High-level programming languages facilitate building complex systems; probabilistic programming languages facilitate building rich ML applications.
  35. And there are so many languages / systems…

  36. BUGS family ✦ BUGS: "Bayesian inference Using Gibbs Sampling" ✦ WinBUGS, OpenBUGS ✦ JAGS: "Just Another Gibbs Sampler" ✦ goedman/Jags.jl ✦ olofsen/JAGS.jl

     ‣ [Lunn+ 2009] "The BUGS project: Evolution, critique and future directions" ‣ (ja) "We held the 1st BUGS/Stan study meetup" - Analyze IT., September 2013 ‣ (ja) "Iwanami Data Science Vol.1", special feature: Bayesian inference and free software for MCMC
  37. STAN ✦ Imperative probabilistic programming language ✦ NumFOCUS project ✦ Interfaces: R, Python, shell, MATLAB, Stata, Julia

     ‣ Stan, official website (logo from this page) ‣ Bob Carpenter: Stan.jl - Statistical Modeling and Inference Made Easy - YouTube, JuliaCon 2015 ‣ Stan is Turing Complete. So what? - Statistical Modeling, Causal Inference, and Social Science
  38. PyMC ✦ Python ✦ PyMC2 / PyMC3 ✦ "Bayesian Methods for Hackers"

     ‣ pymc-devs/pymc: PyMC: Bayesian Stochastic Modelling in Python ‣ Probabilistic Programming & Bayesian Methods for Hackers (figure from this page) ‣ (ja) "Iwanami Data Science Vol.1", special feature: Bayesian inference and free software for MCMC
  39. (ja) Iwanami Data Science Vol.1, special feature: Bayesian inference and free software for MCMC. Image from sites.google.com/site/iwanamidatascience/

  40. Church ✦ Scheme-based; named after Alonzo Church ✦ Great resources ✦ "Probabilistic Models of Cognition" ★ ✦ "Forest: a repository for generative models" ✦ Online play space ✦ Cognitive Science ✦ "Computation and Cognition: the Probabilistic Approach" (Psych / CS, Fall), Stanford CoCoLab ✦ (ja) "My Bookmarks: Computational Cognitive Science", Journal of the Japanese Society for Artificial Intelligence, Tatsuji Takahashi

     ‣ Church Wiki ‣ Probabilistic Models of Cognition ‣ Forest - A Repository for Generative Models
  41. probmods.org

  42. forestdb.org

  43. WebPPL (pronounced 'web people') ✦ JavaScript-based, purely functional ✦ Great resources ✦ The Design and Implementation of Probabilistic Programming Languages ✦ PPAML Summer School ✦ WebPPL Documentation (webppl documentation) ✦ Modeling Agents with Probabilistic Programs
  44. BayesDB / Venture ✦ The MIT Probabilistic Computing Project ✦ BayesDB ✦ BQL (Bayesian Query Language), an extension of SQL: ESTIMATE, SIMULATE, … ✦ Venture ✦ probabilistic computing platform; hosts applications like BayesDB; programmed primarily in "VentureScript"

     ‣ The MIT Probabilistic Computing Project ‣ "An Overview of Probabilistic Programming" by Vikash K. Mansinghka - YouTube, Strange Loop 2015 ‣ MIT Media Lab | Vikash Mansinghka on Probabilistic Programming for Augmented Intelligence, 2016
  45. Infer.NET / Fun / Tabular ✦ Microsoft Research ✦ Infer.NET Fun ✦ F# probabilistic modeling language ✦ Tabular: schema-driven approach ✦ available as an Excel add-in

     ‣ ETAPS 2016 - Structure and Interpretation of Probabilistic Programs - Andrew D. Gordon - YouTube ‣ Infer.NET Fun - Microsoft Research ‣ [Gordon+ 2014] "Probabilistic Programming", ICSE Future of Software Engineering (figure from this paper)
  46. Figaro ✦ Scala-based, Object-Oriented ✦ There's a book!

     ‣ Avi Pfeffer, "Practical Probabilistic Programming" (Manning, 2016) (figure from this page) ‣ Probabilistic Programming Language | Probabilistic Modeling ‣ Avi Pfeffer - Practical Probabilistic Programming with Figaro - MLconf SEA 2016 - YouTube, slide
  47. and more… ✦ Anglican: integrated with Clojure (MLSS Tutorial) ✦ Probabilistic C ✦ Edward: "A library for probabilistic modeling, inference, and criticism" (Python / TensorFlow; Stan, PyMC3) ✦ "Existing probabilistic programming systems", PROBABILISTIC-PROGRAMMING.org ✦ "List of probabilistic programming languages", Probabilistic programming language - Wikipedia, the free encyclopedia ✦ "Venture (MIT), Figaro (Charles River Analytics), and BLOG (Berkeley) were given the lion's share of attention due to their (relative) maturity, but other languages present included Church (Stanford), Hakaru (University of Indiana), and Chimple/Dimple (Gamalon). This is by no means a full accounting of the space of PPSs, which includes Stan (Columbia), Infer.NET (Microsoft), and very many others." (quote from The State of Probabilistic Programming « Some Thoughts on a Mysterious Universe, emphasis added)

     ‣ http://www.robots.ox.ac.uk/~fwood/anglican/ (figure from this page) ‣ http://edwardlib.org/ (figure from this page)
  48. Julia

  49. JuliaStats ✦ Set of packages / Community ✦ "Statistics and Machine Learning made easy in Julia" ✦ Distributions.jl: sampling, … ✦ A thread on the JuliaStats mailing list: "Probabilistic Programming in Julia" - Google Groups

     ‣ https://github.com/JuliaStats (figure from this page) ‣ JuliaStats.org ‣ mailing list : julia-stats - Google Groups
  50. MCMC ✦ JuliaStats/MCMC.jl: currently a placeholder repository ✦ JuliaStats/Lora.jl ✦ brian-j-smith/Mamba.jl ✦ goedman/Stan.jl ✦ goedman/Jags.jl ✦ olofsen/JAGS.jl ✦ LaurenceA/… (previously known as Turing.jl)

     ‣ scidom (Theodore Papamarkou) (figure from this page), Lora User Guide — Lora 0.5.3 documentation ‣ Names of Existing MCMC Packages · Issue #1 · JuliaStats/MCMC.jl ‣ (ja) "Talked about Monte Carlo methods at JuliaTokyo #2" (blog post)
  51. Sigma.jl ✦ Probabilistic programming environment in Julia ✦ Talks at JuliaCon, Strange Loop ✦ "Julia as a Probabilistic Programming Language" by Zenna Tavares (no video) ✦ "Probabilistic Programs Which Make (Common) Sense" by Zenna Tavares (YouTube)

     ‣ https://github.com/zenna/Sigma.jl ‣ Sigma.jl's documentation — Sigma 0.0.1 documentation ‣ "Probabilistic Programs Which Make (Common) Sense" by Zenna Tavares - YouTube, Strange Loop 2015
  52. some more ✦ yebai/Turing.jl ✦ "Rejuvenating probabilistic programming in Julia" ✦ Still young: initial commit in April ✦ Ge, Hong, Adam Scibior, and Zoubin Ghahramani. "Turing: rejuvenating probabilistic programming in Julia." (In submission.) ✦ null-a/Stochy.jl: "no longer under development" ✦ LaurenceA/Church.jl: obsolete? ✦ …

     ‣ The Turing Language — Turing.jl 0.0.1 documentation ‣ Julia.jl/Statistics.md at master · svaksha/Julia.jl
  53. Picture ✦ A Probabilistic Programming Language for Scene Perception ✦ Based on Julia ✦ MIT ProbComp Project (Venture, BayesDB, …) ✦ [Kulkarni+] "Picture: a probabilistic programming language for scene perception", CVPR (Best Paper Honorable Mention) ✦ "Probabilistic programming does in … lines of code what used to take thousands"

     ‣ Picture: A Probabilistic Programming Language for Scene Perception (figure from this page) ‣ Short probabilistic programming machine-learning code replaces complex programs for computer-vision tasks | KurzweilAI ‣ Picture: A Probabilistic Programming Language for Scene Perception - TechTalks.tv, prezi slide
  54. OpenPPL: A Proposal for Probabilistic Programming in Julia

     ‣ OpenPPL: A Proposal for Probabilistic Programming in Julia — OpenPPL 0.1.0 documentation ‣ JuliaStats/PGM.jl: A Julia framework for probabilistic graphical models. ‣ Probabilistic Programming in Julia - Google Groups
  55. Where to start?

  56. Where to start? ✦ Readings ✦ What is probabilistic programming? - The PL Enthusiast ✦ The State of Probabilistic Programming « Some Thoughts on a Mysterious Universe ✦ Interactive Tutorials ✦ Probabilistic Models of Cognition (Church) ✦ The Design and Implementation of Probabilistic Programming Languages (WebPPL)

     ‣ PROBABILISTIC-PROGRAMMING.org ‣ Resources – PPAML
  57. Some papers ✦ [Gordon+ 2014] "Probabilistic Programming", ICSE Future of Software Engineering ✦ [Goodman+] "Church: a language for generative models", Uncertainty in Artificial Intelligence (UAI) ✦ PhD Theses ✦ Andreas Stuhlmüller, "Modeling Cognition with Probabilistic Programs: Representations and Algorithms" ✦ Daniel M. Roy, "Computability, inference and modeling in probabilistic programming" ✦ Vikash Mansinghka, "Natively Probabilistic Computation"

     ‣ Research articles on probabilistic programming: PROBABILISTIC-PROGRAMMING.org