Compositional Creativity (ICFP 2017)

Chris
September 04, 2017


Transcript

  1. ICFP 2017, Oxford, UK Compositional Creativity Chris Martens Assistant Professor

    Computer Science Department North Carolina State University
  2. About Me Assistant Professor (UK equiv.: Lecturer) at NCSU Postdoc

    UC Santa Cruz, Expressive Intelligence Studio PhD CMU 2015, POP group 2
  3. About Me Conferences I used to go to had words

    like: functional, types, logic, deduction, languages Conferences I go to now have words like: artificial intelligence, narrative, interactive storytelling, games, play
  4. My Lab
  5. Research Mission Use principles from logic and formal methods to

    enable more expressive human-computer collaboration 5
  6. Human-Computer Collaboration ❖ Interactive storytelling ❖ Intelligent tutors ❖ Conversational

    agents ❖ Mixed-initiative creative tools 6
  7. “Artificial Intelligence”? ❖ Interactive storytelling ❖ Intelligent tutors ❖ Conversational

    agents ❖ Mixed-initiative creative tools 7
  8. Unifying themes: logic ❖ Logic as an intermediate language between

    human and computational thought ❖ Reasoning about synthesized behavior 8
  9. Unifying themes: social reasoning ❖ Simulating social agents ❖ Designing

    languages and other human-computer interfaces through a metaphor of conversation 9
  10. This talk From classical AI to modern FP Plots as

    plans as proofs Intentional reasoning Epistemic reasoning 10
  11. From classical AI to modern FP From classical AI to

    modern FP Plots as plans as proofs Intentional reasoning Epistemic reasoning 11
  12. A common origin of FP and AI, 1956 12

  13. Influence on AI First system to codify reasoning as

    search Presented at McCarthy’s AI-defining Dartmouth Research Project, Summer 1956 AI historian Pamela McCorduck: “proof positive that a machine could perform tasks heretofore considered intelligent, creative and uniquely human”
  14. Influence on FP To build the Logic Theory Machine (LTM), they implemented a new

    language called IPL Influenced McCarthy’s design of LISP: use of lists rather than arrays; hierarchical subroutines and recursion
  15. “Current Developments in Complex Information Processing” Allen Newell and Herbert

    Simon, 1959 15
  16. Reasoning as Search: 3 examples * Theorem proving * Program

    synthesis * AI planning 16
  17. Reasoning as Search: 3 examples * Theorem proving * Program

    synthesis * AI planning 17
  18. Theorem Proving Γ ⊢A 18 Initial Assumptions Goal Defined by

    atomic inference rules
  19. Theorem Proving Γ ⊢A 19 Γ ⊢A Composition of inferences

  20. Reasoning as Search: 3 examples * Theorem proving * Program

    synthesis * AI planning 20
  21. Reasoning as Search: 3 examples * Theorem proving * Program

    synthesis * AI planning Conor McBride (since at least 2013): “Type inference is so last century! Let’s write more detailed types, do program inference!”
  22. (Type-based) Program Synthesis Γ ⊢A 22 Library functions Specification Typing

    rules of the language
  23. (Type-based) Program Synthesis Γ ⊢A :A let rec reverse l

    = case l of nil => … | cons x::xs => … Osera and Pierce - “Type-and-example-directed program synthesis,” POPL 2015; Polikarpova et al., “Program synthesis from polymorphic refinement types,” POPL 2016 23 Composition of program constructs
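The search idea behind type-and-example-directed synthesis can be sketched very roughly in Haskell: enumerate candidate programs built from a small component library and keep the ones consistent with the examples. The library and names below are hypothetical toys; the cited POPL papers use far more refined, type-guided search.

```haskell
-- Toy sketch of example-directed synthesis over a tiny component library.
-- Hypothetical illustration only; not the Osera/Pierce or Polikarpova algorithms.

type Prog = [Int] -> [Int]

-- Hypothetical component library the search draws from.
library :: [(String, Prog)]
library =
  [ ("id",      id)
  , ("reverse", reverse)
  , ("tail",    drop 1)
  , ("double",  map (* 2))
  ]

-- Candidate programs: single components and two-step compositions.
candidates :: [(String, Prog)]
candidates =
  library ++
  [ (n2 ++ " . " ++ n1, f2 . f1) | (n1, f1) <- library, (n2, f2) <- library ]

-- Keep the candidates consistent with every input/output example.
synthesize :: [([Int], [Int])] -> [String]
synthesize examples =
  [ name | (name, f) <- candidates, all (\(i, o) -> f i == o) examples ]

main :: IO ()
main = print (synthesize [([1, 2, 3], [6, 4, 2]), ([5], [10])])
```

Here the examples pin down “reverse, then double each element” (in either composition order), illustrating how examples prune the composition space.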
  24. Reasoning as Search: 3 examples * Theorem proving * Program

    synthesis * AI planning 24
  25. AI Behavior Synthesis (Planning) Γ ⊢A 25 Initial State Goal

    state Atomic operators that can change the state
  26. AI Behavior Synthesis (Planning) Γ ⊢A move a1 a2; pickup

    block1; move a2 b2; place block1 : {at(bot, a1), at(block1, a2)} —> {at(bot, b2), at(block1, b2)} Plan (sequence of operations)
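The plan on this slide can be found by a toy breadth-first planner over precondition/add/delete operators. A minimal Haskell sketch, with hypothetical blocks-world operators simplifying the slide's example (real planners use heuristics, not plain BFS):

```haskell
-- Toy STRIPS-style planner: states are sets of facts, operators have
-- preconditions, added facts, and deleted facts. Sketch only.
import Data.List (sort, union, (\\))

type Fact = String
type State = [Fact]  -- kept sorted, treated as a set

data Op = Op { opName :: String, pre, add, del :: [Fact] }

-- Fire an operator if its preconditions hold in the current state.
step :: State -> Op -> Maybe State
step s op
  | all (`elem` s) (pre op) = Just (sort (union (s \\ del op) (add op)))
  | otherwise               = Nothing

-- Breadth-first search for a plan reaching the goal facts.
plan :: [Op] -> State -> State -> Maybe [String]
plan operators goal start = go [(sort start, [])] []
  where
    go [] _ = Nothing
    go ((s, path) : rest) seen
      | all (`elem` s) goal = Just (reverse path)
      | s `elem` seen       = go rest seen
      | otherwise           = go (rest ++ successors) (s : seen)
      where
        successors =
          [ (s', opName op : path) | op <- operators, Just s' <- [step s op] ]

-- Hypothetical operators mirroring the slide's example.
ops :: [Op]
ops =
  [ Op "move a1 a2"    ["at(bot,a1)"]                      ["at(bot,a2)"]      ["at(bot,a1)"]
  , Op "pickup block1" ["at(bot,a2)", "at(block1,a2)"]     ["holding(block1)"] ["at(block1,a2)"]
  , Op "move a2 b2"    ["at(bot,a2)"]                      ["at(bot,b2)"]      ["at(bot,a2)"]
  , Op "place block1"  ["at(bot,b2)", "holding(block1)"]   ["at(block1,b2)"]   ["holding(block1)"]
  ]

main :: IO ()
main = print (plan ops ["at(bot,b2)", "at(block1,b2)"]
                       ["at(bot,a1)", "at(block1,a2)"])
```

Running this recovers the slide's four-step plan from the initial configuration.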
  27. Creative machines? * Theorem proving * Program synthesis * AI

    planning 27
  28. Plots as plans as proofs From classical AI to modern

    FP Plots as plans as proofs Intentional reasoning Epistemic reasoning 28
  29. AI Planning Domain - sets of operators (parameterized actions) Problem

    - initial and goal configuration —PLANNER—> plan, i.e. sequence of operators applied to terms to reach goal from initial
  30. AI Planning Domains PDDL/STRIPS - domain-specific languages for planners

    Actions defined in terms of preconditions and effects Preconditions, effects, and configurations are collections of logical predicates
  31. Story Generation Planning for Plot Generation
 (Young 1999; Cavazza and

    Charles 2001-2) Linear Logic Programming for Plot Generation
 (Martens, Ferreira, Bosser, Cavazza 2013) 31
  32. Story Configurations initial: { dead(jon_arryn, hand), role(robert_baratheon, king), heir(robert_baratheon, joffrey), allies(robert_baratheon, eddard_stark), role(eddard_stark, warden_of_north) } sets of atomic propositions describing the world (Game-of-Thrones-inspired example)
  33. Story Operators narrative events, e.g. ways characters can bid for

    power name_hand(Ruler, NewHand) :
 preconditions: need_filled(hand),
 role(Ruler, ruler_of_westeros),
 allies(Ruler, NewHand) effects: offered_role(NewHand, hand)
 delete need_filled(hand) 33
  34. Story Operators As a linear logic formula: name_hand(Ruler, NewHand) :

    need_filled(hand) *
 has_role(Ruler, ruler_of_westeros) *
 allies(Ruler, NewHand) -o offered_role(NewHand, hand) *
 has_role(Ruler, ruler_of_westeros) *
 allies(Ruler, NewHand) 34
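Operationally, the linear-logic reading amounts to multiset rewriting: firing a rule consumes the resources on the left of the lolli and produces the ones on the right. A minimal Haskell sketch, with the rule instantiated at hypothetical constants (robert, eddard) rather than the logic variables on the slide:

```haskell
-- Multiset-rewriting reading of a linear-logic story operator. Sketch only;
-- the talk's actual system is a linear logic programming language.
import Data.List (delete)

type Resource = String
type State = [Resource]  -- a multiset of atomic propositions

data Rule = Rule { ruleName :: String, consumes, produces :: [Resource] }

-- Remove each consumed resource exactly once; fail if any is missing.
consume :: [Resource] -> State -> Maybe State
consume [] s = Just s
consume (r : rs) s
  | r `elem` s = consume rs (delete r s)
  | otherwise  = Nothing

-- Fire a rule: consume the left-hand side, then add the right-hand side.
apply :: Rule -> State -> Maybe State
apply rule s = fmap (++ produces rule) (consume (consumes rule) s)

-- name_hand instantiated with Ruler = robert, NewHand = eddard (hypothetical).
nameHand :: Rule
nameHand = Rule "name_hand(robert, eddard)"
  ["need_filled(hand)", "has_role(robert, ruler_of_westeros)", "allies(robert, eddard)"]
  ["offered_role(eddard, hand)", "has_role(robert, ruler_of_westeros)", "allies(robert, eddard)"]

main :: IO ()
main = print (apply nameHand
  ["need_filled(hand)", "has_role(robert, ruler_of_westeros)", "allies(robert, eddard)"])
```

Note that need_filled(hand) is consumed and not reproduced, which is exactly the “delete” effect of the operator-style presentation.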
  35. Compositional notation for plots [diagram: plot events drawn as composed arrows between characters]
  36. Compositional notation for plots [diagram: one character names another]
  37. Compositional notation for plots [diagram: one character observes another]
  38. Compositional notation for plots [diagram: the composed plot grows across characters]
  39. Compositional notation for plots [diagram: the composition spans SEASON 1, EPISODE 2]
  40. Compositional notation for plots [diagram: the composition spans SEASON 1, EPISODE 2] c.f. symmetric monoidal categories
  41. Story Generation goal: has_role(·, ·) [character and role shown as icons] Searching for plots with specified endings
  42. Example: Story Generation Output: computer-generated fan fiction 42

  43. Narrative variation? goal: 
 Exists C. 
 has_role(C, ruler_of_westeros) 43

    Searching for plots with varied endings
  44. Planners & theorem provers find the most efficient solution : Exists C. has_role(C, ruler_of_westeros) Narrative variation?
  45. Planners & theorem provers find the most efficient solution has_role(rb, ruler_of_westeros) |- : Exists C. has_role(C, ruler_of_westeros) Narrative variation?
  46. Planners & theorem provers find the most efficient solution has_role(rb, ruler_of_westeros) |- “The end.” : Exists C. has_role(C, ruler_of_westeros) Narrative variation?
  47. Planners & theorem provers find the most efficient solution has_role(rb, ruler_of_westeros) |- “The end.” : Exists C. has_role(C, ruler_of_westeros) Stories are anti-efficient! Narrative variation?
  48. A Partial Solution: Nondeterministic Proof Construction Linear logic for story

    generation: Martens et al. 2013 (LPNMR) Use forward-chaining nondeterminism to leave goals unspecified, randomly explore the search space Sacrifice completeness of search for variation in outcome
  49. Another Limitation Stories generate interest e.g. through characters with conflicting

    goals Subterfuge, betrayal, suspense, etc. all depend on characters acting on goals with limited knowledge 49
  50. Meanwhile, in program synthesis… 50

  51. Adding constraints Program synthesis: Refining types with input-output examples (test

    cases) f : int -> nat | f 1 => 1 | f 2 => 4 | f 5 => 25
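The examples on this slide act as a filter on candidate programs: any hypothesis inconsistent with them is discarded. A tiny Haskell sketch with a hypothetical candidate space (the slide's examples single out squaring):

```haskell
-- Input/output examples as constraints: keep only candidate functions
-- consistent with every example. Candidates here are hypothetical toys.
candidates :: [(String, Int -> Int)]
candidates = [("identity", id), ("double", (* 2)), ("square", \x -> x * x)]

consistent :: [(Int, Int)] -> [String]
consistent examples =
  [ name | (name, f) <- candidates, all (\(i, o) -> f i == o) examples ]

main :: IO ()
main = print (consistent [(1, 1), (2, 4), (5, 25)])
```

With the slide's three examples, only the squaring candidate survives; with just (1, 1), identity and square would both remain, showing why more examples refine the specification.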
  52. Adding constraints Story generation: Enforcing intentional behavior in characters 52

  53. Intentional reasoning From classical AI to modern FP Plots as

    plans as proofs Intentional reasoning Epistemic reasoning 53
  54. Game AI: from opponents to cooperators Computer opponents (e.g. Chess,

    Go):
 moves need not be intelligible to human Computer cooperators:
 need to move in a legible, intentional-seeming way 54
  55. Grice's Maxims of Conversation ❖ Quantity - say neither too

    much nor too little ❖ Quality - tell the truth ❖ Relation - what is said should be relevant to the topic/stated intentions ❖ Manner - avoid obscurity and ambiguity (Paul Grice, “Logic and Conversation,” 1975)
  56. Hanabi Your hand is visible only to others Your shared

    goal is to build sequentially-ordered stacks of like-color cards
  57. Hanabi One turn: Play, Discard, or Give hint (spends a

    token) 57
  58. Hanabi: hints 58

  59. Computational cooperation in Hanabi ❖ Model the player’s knowledge, including

    their model of our knowledge ❖ Give hints to human players that will be recognized as intentional: point out cards when unambiguous whether to play or discard
 
 59
  60. Computational cooperation in Hanabi ❖ Model the player’s knowledge, including

    their model of our knowledge ❖ Give hints to human players that will be recognized as intentional: point out cards when unambiguous whether to play or discard (these points form the “Intentional” model)
  61. Computational cooperation in Hanabi ❖ Model the player’s knowledge, including

    their model of our knowledge ❖ Give hints to human players that will be recognized as intentional: point out cards when unambiguous whether to play or discard ❖ Interpret hints from human players as carrying the same intent; play or discard appropriately (the first two points form the “Intentional” model; all three form the “Full” model)
  62. Computational Intelligence in Games (CIG) 2017 3 implementations (base, intentional,

    full) + experimental evaluation 1. Simulation of computer:computer 2. Human subjects study, human:computer 62
  63. Computational Intelligence in Games (CIG) 2017 63

  64. Computational Intelligence in Games (CIG) 2017 64 Players rated more

    intentional agents both more enjoyable and more intentional-seeming
  65. Grice's Maxims of Conversation Quantity - say neither too much

    nor too little ❖ Enforced by game rules 65
  66. Grice's Maxims of Conversation Quality - tell the truth ❖

    Players have shared goals 66
  67. Grice's Maxims of Conversation Relation - be relevant ❖ Only

    give hints that can be interpreted as play/discard instructions 67
  68. Grice's Maxims of Conversation Manner - avoid ambiguity ❖ Only

    hint sets of cards for which the actionable subset is unambiguous 68
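The unambiguity rule from the last two slides can be made concrete: classify each card a hint would touch, and give the hint only if every touched card calls for the same action. A toy Haskell sketch with hypothetical simplifications (a card is "playable" only when it is exactly the next value for its color stack; everything else counts as discardable, which is cruder than real Hanabi reasoning):

```haskell
-- Toy version of the "intentional hint" rule: hint only when the actionable
-- subset is unambiguous. Cards and stacks are hypothetical simplifications.
data Action = Play | Discard deriving (Eq, Show)

-- A card is playable if its value is exactly one above its color's stack.
classify :: [(String, Int)] -> (String, Int) -> Action
classify stacks (color, value) =
  case lookup color stacks of
    Just h | value == h + 1 -> Play
    _                       -> Discard

-- An intentional hint: every touched card shares one action, else no hint.
hintAction :: [(String, Int)] -> [(String, Int)] -> Maybe Action
hintAction stacks touched =
  case map (classify stacks) touched of
    []                       -> Nothing
    (a : as) | all (== a) as -> Just a
    _                        -> Nothing

main :: IO ()
main = do
  let stacks = [("red", 2), ("blue", 0)]
  print (hintAction stacks [("red", 3), ("blue", 1)])  -- all playable: hint
  print (hintAction stacks [("red", 3), ("blue", 3)])  -- mixed: withhold hint
```

Returning Nothing for the mixed case is the Manner maxim in miniature: a hint that touches both playable and unplayable cards would be ambiguous, so it is never given.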
  69. Limitation Only one step of look-ahead How to apply intentional

    reasoning to search? 69
  70. Intentionality depends on Belief Transition from ad-hoc encoding of knowledge

    to formal Describe how beliefs influence actions and vice versa 70
  71. Epistemic reasoning From classical AI to modern FP Plots as

    plans as proofs Intentional reasoning Epistemic reasoning 71
  72. Theory of Mind Forming mental models of other people’s mental

    models 72
  73. Theory of Mind Forming mental models of other people’s mental

    models A key form of social reasoning in humans 73
  74. 3 Logicians Walk into a Pub 74 Will everyone have

    a beer?
  75. 3 Logicians Walk into a Pub 75 Will everyone have

    a beer? I don’t know
  76. 3 Logicians Walk into a Pub Will everyone have

    a beer? I don’t know I don’t know
  77. 3 Logicians Walk into a Pub Will everyone have

    a beer? I don’t know I don’t know Yes!
  78. Agent a believes A
 Epistemic Logic ☐a A
 78

  79. Dynamic Epistemic Logic 79 ☐a A
 [α]A Agent a believes

    A
 A holds under action α
  80. Agent a believes A
 A holds under action α Dynamic

    Epistemic Logic ☐a A
 [α]A 80 Actions may change the 
 actual world or the
 set of worlds an agent considers possible
  81. Epistemic Actions (Baltag 2002) α, β ::= flip p 


    | ?A 
 | α + β 
 | α ; β 
 Change truth value
 Precondition
 Nondeterministic choice
 Sequence
 81
  82. Epistemic Actions (Baltag 2002) α, β ::= flip p 


    | ?A 
 | α + β 
 | α ; β 
 | αa 
 | α* Change truth value
 Precondition
 Nondeterministic choice
 Sequence
 Appearance to a
 Public action 82
  83. Semantics: Possible Worlds Epistemic states: sets of worlds each agent

    considers possible
  84. Semantics: Possible Worlds An actual world Set of worlds

    an agent considers possible
  85. Semantics: Possible Worlds These sets can expand and contract based

    on actions 85
  86. Semantics: Possible Worlds [diagram] Apply α
  87. Semantics: Possible Worlds [diagram: each possible world evolves by α] Apply α
  88. Semantics: Possible Worlds [diagram] Apply α + β
  89. Semantics: Possible Worlds [diagram: worlds evolve by α or by β] Apply α + β
  90. Semantics: Possible Worlds [diagram] Apply ?(· v ·); α
  91. Semantics: Possible Worlds [diagram: only worlds passing the test evolve by α] Apply ?(· v ·); α
  92. –Petyr Baelish “Fight every battle everywhere, always, in your mind.

    Everyone is your enemy, everyone is your friend. Every possible series of events is happening all at once. Live that way and nothing will surprise you. Everything that happens will be something that you’ve seen before.” 92
  93. A tiny example Alice flips a coin in secret; Bob

    and Carol observe coin(heads)Alice + coin(tails)Alice; (coin(heads) + coin(tails)){Bob, Carol}
  94. A practical action language? 94 Aggregate sums/products for expressing more

    than Boolean changes
 Term size blowup
  95. Ostari - Eger & Martens 2017 Parameterized actions
 Parameters can

    be secret to specific sets of agents 95 ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩) <cmd>
  96. Ostari - Eger & Martens 2017 <cmd> ::=
 <property> :=

    <value>
 | learn <agents> <fact>
 | public <cmd>
 | <cmd> ; <cmd> 96 ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩) <cmd>
  97. Ostari - Eger & Martens 2017 <cmd> ::=
 <property> :=

    <value>
 | learn <agents> <fact>
 | public <cmd>
 | <cmd> ; <cmd> 97 ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩) <cmd>
  98. Ostari - Eger & Martens 2017 <cmd> ::=
 <property> :=

    <value>
 | learn <agents> <fact>
 | public <cmd>
 | <cmd> ; <cmd> <fact> ::=
 exists x.<cond>
 | all x.<cond>
 | each x.<cond>
 | which x.<cond> 98 ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩) <cmd>
  99. Ostari - Eger & Martens 2017 <cmd> ::=
 <property> :=

    <value>
 | learn <agents> <fact>
 | public <cmd>
 | <cmd> ; <cmd> <fact> ::=
 exists x.<cond>
 | all x.<cond>
 | each x.<cond>
 | which x.<cond> 99 ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩) <cmd>
  100. Epistemic Quantifiers 100 learn (p): exists c in Cards: at(deck,

    1) == c. Useless knowledge
  101. Epistemic Quantifiers 101 learn (p): which c in Cards: at(deck,

    1) == c. Useful knowledge
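The exists/which contrast can be shown directly on sets of possible worlds: learning *that* some card is on top is true in every world and rules nothing out, while learning *which* card it is collapses the set to the worlds that agree. A toy Haskell sketch where a world is just the identity of the top card (the cards are hypothetical):

```haskell
-- "exists" vs "which" as operations on an agent's possible worlds.
-- Toy model of the slides' distinction; not the Ostari implementation.
type Card = String

-- Worlds the agent considers possible: each candidate top card of the deck.
worlds :: [Card]
worlds = ["red1", "blue1", "green1"]

-- "exists c in Cards: at(deck,1) == c" holds in every world (some card is
-- on top), so conditioning on it eliminates nothing: useless knowledge.
learnExists :: [Card] -> [Card]
learnExists ws = [ w | w <- ws, any (== w) worlds ]

-- Learning *which* card is on top keeps only worlds agreeing with the
-- actual one: useful knowledge.
learnWhich :: Card -> [Card] -> [Card]
learnWhich actual ws = [ w | w <- ws, w == actual ]

main :: IO ()
main = do
  print (learnExists worlds)
  print (learnWhich "blue1" worlds)
```

The first query leaves all three worlds standing; the second leaves exactly one, which is why the slides label them useless and useful knowledge respectively.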
  102. Example 102 hintcolor(p: Players, c: Colors) learn (p): Each i

    in HandIndex: color(at(p, i)) == c Hinting a color in Hanabi
  103. Implementation in Haskell
 (laziness useful for possible worlds representation!) Sequences

    of actions and queries Goals that generate action sequences Ostari - Eger & Martens 2017 103
  104. Epistemic Narrative Generation WorldA:
 at(Sherlock) = BakerStreet
 looks like (Sherlock):

    WorldA
 looks like (Moriarty): WorldA, WorldB WorldB:
 at(Sherlock) = ScotlandYard
 looks like (Sherlock): WorldB
 looks like (Moriarty): WorldA, WorldB 104
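The two-world setup on this slide is a small Kripke model, and nested beliefs can be checked against it directly: agent a believes φ at w when φ holds in every world that w "looks like" to a. A toy Haskell sketch checking a simpler nested belief (about Sherlock's location) in the same world structure:

```haskell
-- Kripke-style nested-belief check over the slide's two-world model.
-- Toy illustration; the talk's generator works over richer goals.
data W = WorldA | WorldB

data Agent = Sherlock | Moriarty

-- Where Sherlock is in each world (from the slide).
atSherlock :: W -> String
atSherlock WorldA = "BakerStreet"
atSherlock WorldB = "ScotlandYard"

-- Each agent's accessibility relation, i.e. "looks like" (from the slide).
looks :: Agent -> W -> [W]
looks Sherlock w = [w]                  -- Sherlock can tell the worlds apart
looks Moriarty _ = [WorldA, WorldB]     -- Moriarty cannot

-- Box modality: agent a believes phi at w iff phi holds in every world
-- that w looks like to a.
believes :: Agent -> (W -> Bool) -> W -> Bool
believes a phi w = all phi (looks a w)

atYard :: W -> Bool
atYard w = atSherlock w == "ScotlandYard"

main :: IO ()
main = do
  print (believes Sherlock atYard WorldB)                      -- True
  print (believes Moriarty (believes Sherlock atYard) WorldB)  -- False
```

In WorldB, Sherlock believes he is at Scotland Yard, but Moriarty does not believe that Sherlock believes it, since Moriarty still entertains WorldA; that is precisely the kind of nested ☐(Moriarty): ☐(Sherlock): goal the generator searches for.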
  105. Epistemic Narrative Generation goal: 
 murdered(Moriarty, Victim) ^
 ☐(Moriarty): ☐(Sherlock):

    exists M. 
 M != Moriarty ^ murdered(M, Victim) 105
  106. Epistemic Narrative Generation Generates a story in which Moriarty frames

    someone for murder goal: 
 murdered(Moriarty, Victim) ^
 ☐(Moriarty): ☐(Sherlock): exists M. 
 M != Moriarty ^ murdered(M, Victim) 106
  107. Summary 3 investigations into using logic for creativity: 1. Generating

    narrative with planning/linear logic 2. Intentional reasoning in the context of cooperation 3. Codifying intentionality with Epistemic Logic 107
  108. From modern FP to future AI From classical AI to

    modern FP Plots as plans as proofs Epistemic reasoning Intentional reasoning From modern FP to future AI 108
  109. Compositionality FP’s secret weapon [A ∘ B] == [A] [∘]

    [B] 109
  110. Compositionality Function composition (meaning [-] is interface) [diagram: f from x to y, g from y to z]
  111. Compositionality Function composition (meaning [-] is interface) [diagram: f ∘ g from x to z]
  112. Compositionality Proof composition (use of lemmas) Δ ⊢ A and Δ’, A ⊢ C
  113. Compositionality Proof composition (use of lemmas) Δ, Δ’ ⊢ C
  114. Compositionality In planning: Frame Principle (“horizontal” composition) [diagram: a plan π from Δ to Δ’ beside an untouched context ρ]
  115. Compositionality In planning: Frame Principle (“horizontal” composition) [diagram: parallel composition π || ρ from Δ to Δ’]
  116. Compositionality In linear logic: Frame Principle (“horizontal” composition) [diagram: parallel composition π || ρ from Δ to Δ’]
  117. Research Challenges Dynamic logics with functional action languages Constructive/lambda-calculable DEL

    Epistemic session types 117
  118. Compositional Creativity? asking computers to generate 
 insights, behaviors, narrative


    based on logical specifications 118
  119. Why Games and Stories? Logic is a great intermediate language

    between humans and machines, but insufficient Humans have been communicating through play and storytelling for even longer than they have been doing logic The best abstractions are discovered in diverse disciplines! 119
  120. Thanks! Chris Martens http://go.ncsu.edu/martens martens@csc.ncsu.edu @chrisamaphone github.com/chrisamaphone 120