
Compositional Creativity (ICFP 2017)

Chris
September 04, 2017


  1. ICFP 2017, Oxford, UK
    Compositional
    Creativity
    Chris Martens
    Assistant Professor
    Computer Science Department
    North Carolina State University


  2. About Me
    Assistant Professor (UK equiv.: Lecturer) at NCSU
    Postdoc UC Santa Cruz, Expressive Intelligence Studio
    PhD CMU 2015, POP group
    2


  3. About Me
    Conferences I used to
    go to had words like:
    functional, types,
    logic, deduction,
    languages
    3
    Conferences I go to
    now have words like:
    artificial intelligence,
    narrative, interactive
    storytelling, games,
    play


  4. My Lab
    Conferences I used to
    go to had words like:
    functional, types,
    logic, deduction,
    languages
    4
    Conferences I go to
    now have words like:
    artificial intelligence,
    narrative, interactive
    storytelling, games,
    play


  5. Research Mission
    Use principles from logic and
    formal methods to enable more
    expressive human-computer
    collaboration
    5


  6. Human-Computer Collaboration
    ❖ Interactive storytelling
    ❖ Intelligent tutors
    ❖ Conversational agents
    ❖ Mixed-initiative creative tools
    6


  7. “Artificial Intelligence”?
    ❖ Interactive storytelling
    ❖ Intelligent tutors
    ❖ Conversational agents
    ❖ Mixed-initiative creative tools
    7


  8. Unifying themes: logic
    ❖ Logic as an intermediate
    language between human and
    computational thought
    ❖ Reasoning about synthesized
    behavior
    8


  9. Unifying themes: social reasoning
    ❖ Simulating social agents
    ❖ Designing languages and other
    human-computer interfaces
    through a metaphor of
    conversation
    9


  10. This talk
    From classical AI to modern FP
    Plots as plans as proofs
    Intentional reasoning
    Epistemic reasoning
    10


  11. From classical AI to modern FP
    From classical AI to modern FP
    Plots as plans as proofs
    Intentional reasoning
    Epistemic reasoning
    11


  12. A common origin of FP and AI, 1956
    12


  13. Influence on AI
    First system to codify reasoning as search
    Presented at McCarthy’s AI-defining Dartmouth Research Project, Summer 1956
    AI historian Pamela McCorduck: “proof positive that a machine could perform
    tasks heretofore considered intelligent, creative and uniquely human”
    13


  14. Influence on FP
    To build the Logic Theory Machine (LTM), they implemented a new language called IPL
    Influenced McCarthy’s design of LISP
    Use of lists rather than arrays
    Hierarchical subroutines and recursion
    14


  15. “Current Developments in
    Complex Information Processing”
    Allen Newell and Herbert Simon, 1959
    15


  16. Reasoning as Search: 3 examples
    * Theorem proving
    * Program synthesis
    * AI planning
    16


  17. Reasoning as Search: 3 examples
    * Theorem proving
    * Program synthesis
    * AI planning
    17


  18. Theorem Proving
    Γ ⊢A
    18
    Initial
    Assumptions
    Goal
    Defined by atomic
    inference rules


  19. Theorem Proving
    Γ ⊢A
    19
    Γ ⊢A
    Composition of inferences


  20. Reasoning as Search: 3 examples
    * Theorem proving
    * Program synthesis
    * AI planning
    20


  21. Reasoning as Search: 3 examples
    * Theorem proving
    * Program synthesis
    * AI planning
    21
    Conor McBride (since at least 2013):
    Type inference is so last century!
    Let’s write more detailed types, do program inference!


  22. (Type-based) Program Synthesis
    Γ ⊢A
    22
    Library
    functions Specification
    Typing rules of the language


  23. (Type-based) Program Synthesis
    Γ ⊢A
    :A
    let rec reverse l =
      case l of nil => …
      | x :: xs => …
    Osera and Pierce - “Type-and-example-directed program
    synthesis,” POPL 2015;
    Polikarpova et al., “Program synthesis from polymorphic
    refinement types,” POPL 2016
    23
    Composition of program
    constructs


  24. Reasoning as Search: 3 examples
    * Theorem proving
    * Program synthesis
    * AI planning
    24


  25. AI Behavior Synthesis (Planning)
    Γ ⊢A
    25
    Initial State
    Goal state
    Atomic operators that can
    change the state


  26. AI Behavior Synthesis (Planning)
    Γ ⊢A move a1 a2
    pickup block1
    move a2 b2
    place block1
    : { at(bot, a1), at(block1, a2) }
    --> { at(bot, b2), at(block1, b2) }
    26
    Plan (sequence of operations)
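The plan judgment above can be sketched operationally. Below is a minimal Python sketch (mine, not from the talk; the operator encoding and names are illustrative assumptions): each operator is a triple of preconditions, adds, and deletes, and a plan is just a sequence of such operators applied in order.

```python
# Hypothetical sketch: applying a sequence of STRIPS-style operators to a state.
# Each operator is (preconditions, adds, deletes), all sets of ground atoms.

def apply(state, op):
    pre, add, delete = op
    assert pre <= state, "precondition failed"
    return (state - delete) | add

# Operators for the robot/block example (encodings made up for illustration).
move_a1_a2 = ({"at(bot,a1)"}, {"at(bot,a2)"}, {"at(bot,a1)"})
pickup_block1 = ({"at(bot,a2)", "at(block1,a2)"}, {"holding(block1)"}, {"at(block1,a2)"})
move_a2_b2 = ({"at(bot,a2)"}, {"at(bot,b2)"}, {"at(bot,a2)"})
place_block1 = ({"at(bot,b2)", "holding(block1)"}, {"at(block1,b2)"}, {"holding(block1)"})

state = {"at(bot,a1)", "at(block1,a2)"}
for op in [move_a1_a2, pickup_block1, move_a2_b2, place_block1]:
    state = apply(state, op)

print(state == {"at(bot,b2)", "at(block1,b2)"})  # True
```

The plan from the slide takes the initial configuration to the goal configuration exactly when every operator's preconditions hold at the point it is applied.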


  27. Creative machines?
    * Theorem proving
    * Program synthesis
    * AI planning
    27


  28. Plots as plans as proofs
    From classical AI to modern FP
    Plots as plans as proofs
    Intentional reasoning
    Epistemic reasoning
    28


  29. AI Planning
    Domain - sets of operators (parameterized actions)
    Problem - initial and goal configuration
    —PLANNER—>
    plan, i.e. sequence of operators applied to terms to reach goal from initial
    29


  30. AI Planning Domains
    PDDL/STRIPS - domain-specific languages for planners
    Actions defined in terms of preconditions and effects
    Preconditions, effects, and configurations are collections of logical predicates
    30


  31. Story Generation
    Planning for Plot Generation

    (Young 1999; Cavazza and Charles 2001-2)
    Linear Logic Programming for Plot Generation

    (Martens, Ferreira, Bosser, Cavazza 2013)
    31


  32. Story Configurations
    initial: {
    dead(jon_arryn, hand),

    role(robert_baratheon, king),

    heir(robert_baratheon, joffrey),

    allies(robert_baratheon, eddard_stark),

    role(eddard_stark, warden_of_north)
    }
    sets of atomic propositions describing the world
    (Game-of-Thrones-inspired example)
    32


  33. Story Operators
    narrative events, e.g. ways characters can bid for power
    name_hand(Ruler, NewHand) :

    preconditions:
    need_filled(hand),

    role(Ruler, ruler_of_westeros),

    allies(Ruler, NewHand)
    effects:
    offered_role(NewHand, hand)

    delete need_filled(hand)
    33
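A hypothetical Python rendering of this operator (not the talk's implementation): the precondition on ruler_of_westeros is simplified away here, and need_filled(hand) is added to the initial configuration so the operator applies; facts are plain strings.

```python
# Hypothetical encoding of the name_hand story operator as precondition/effect
# sets, applied to the Game-of-Thrones-inspired configuration from the slides.

initial = {
    "dead(jon_arryn, hand)",
    "role(robert_baratheon, king)",
    "heir(robert_baratheon, joffrey)",
    "allies(robert_baratheon, eddard_stark)",
    "role(eddard_stark, warden_of_north)",
    "need_filled(hand)",  # added here so the operator is applicable
}

def name_hand(state, ruler, new_hand):
    # preconditions (simplified): the hand role needs filling, and the two are allies
    pre = {"need_filled(hand)", f"allies({ruler}, {new_hand})"}
    if not pre <= state:
        return None
    # effects: offer the role; delete the need (linear-logic style: facts as resources)
    return (state - {"need_filled(hand)"}) | {f"offered_role({new_hand}, hand)"}

after = name_hand(initial, "robert_baratheon", "eddard_stark")
print("offered_role(eddard_stark, hand)" in after)  # True
```

Note the linear-logic reading: the precondition facts are consumed and re-produced, so "need_filled(hand)" disappearing is exactly the delete effect.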


  34. Story Operators
    As a linear logic formula:
    name_hand(Ruler, NewHand) :
    need_filled(hand) *

    has_role(Ruler, ruler_of_westeros) *

    allies(Ruler, NewHand)
    -o
    offered_role(NewHand, hand) *

    has_role(Ruler, ruler_of_westeros) *

    allies(Ruler, NewHand)
    34


  35. Compositional notation for plots
    35
    [string-diagram figure: characters and events composed in sequence and parallel]


  36.
    36
    ⟨character⟩ names ⟨character⟩
    [string-diagram figure]
    Compositional notation for plots


  37. ⟨character⟩ observes ⟨event⟩
    37
    [string-diagram figure]
    Compositional notation for plots


  38. ⟨character⟩ observes ⟨event⟩
    38
    [string-diagram figure, extended]
    Compositional notation for plots


  39. SEASON 1

    EPISODE 2
    39
    [string-diagram figure]
    Compositional notation for plots


  40. SEASON 1

    EPISODE 2
    40
    [string-diagram figure]
    Compositional notation for plots
    cf. symmetric monoidal categories


  41. Story Generation
    goal: has_role(⟨character⟩, ⟨role⟩)
    41
    Searching for plots with specified endings


  42. Example: Story Generation
    Output: computer-generated fan fiction
    42


  43. Narrative variation?
    goal: 

    Exists C. 

    has_role(C, ruler_of_westeros)
    43
    Searching for plots with varied endings


  44. Planners & theorem provers find the most efficient solution


    : Exists C. has_role(C, ruler_of_westeros)
    44
    Narrative variation?


  45. Planners & theorem provers find the most efficient solution
    has_role(rb, ruler_of_westeros) |- 


    : Exists C. has_role(C, ruler_of_westeros)
    45
    Narrative variation?


  46. Planners & theorem provers find the most efficient solution
    has_role(rb, ruler_of_westeros) |- 


    “The end.”
    : Exists C. has_role(C, ruler_of_westeros)
    46
    Narrative variation?


  47. Planners & theorem provers find the most efficient solution
    has_role(rb, ruler_of_westeros) |- 


    “The end.”
    : Exists C. has_role(C, ruler_of_westeros)
    Stories are anti-efficient!
    47
    Narrative variation?


  48. A Partial Solution: Nondeterministic Proof Construction
    Linear logic for story generation: Martens et al. 2013 (LPNMR)
    Use forward chaining nondeterminism to leave goals unspecified,

    randomly explore the search space
    48
    Sacrifice completeness of search for variation in outcome
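A toy Python sketch of the idea (the operators and the fixed seed are invented for illustration, not from the paper): instead of searching backward for the shortest derivation of a goal, repeatedly fire a randomly chosen applicable operator and let the story emerge.

```python
import random

# Hypothetical sketch of forward-chaining story generation: trade completeness
# of search for variation in outcome by firing random applicable operators.

def applicable(state, ops):
    return [op for op in ops if op[1] <= state]  # ops whose preconditions hold

def generate(state, ops, steps, seed=0):
    rng = random.Random(seed)   # seeded for reproducibility in this sketch
    plot = []
    for _ in range(steps):
        choices = applicable(state, ops)
        if not choices:
            break               # quiescence: no operator fires
        name, pre, add, delete = rng.choice(choices)
        state = (state - delete) | add
        plot.append(name)
    return plot, state

# Two made-up narrative operators that undo each other.
ops = [
    ("betray", {"allies(a,b)"}, {"enemies(a,b)"}, {"allies(a,b)"}),
    ("reconcile", {"enemies(a,b)"}, {"allies(a,b)"}, {"enemies(a,b)"}),
]
plot, final = generate({"allies(a,b)"}, ops, steps=5)
print(plot)
```

With these two operators exactly one is applicable at each step, so the run is deterministic; with a richer domain, each seed yields a different plot.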


  49. Another Limitation
    Stories generate interest, e.g., through characters with conflicting goals
    Subterfuge, betrayal, suspense, etc. all depend on characters acting on goals with
    limited knowledge
    49


  50. Meanwhile, in program synthesis…
    50


  51. Adding constraints
    Program synthesis:

    Refining types with input-output examples (test cases)
    51
    f : int -> nat
    f 1 => 1
    f 2 => 4

    f 5 => 25
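A deliberately tiny Python sketch of the idea (the candidate pool and checks are invented for illustration; real synthesizers like Myth search a typed term space): enumerate candidate bodies for f and keep one consistent with the type int -> nat and the examples.

```python
# Hypothetical miniature of type-and-example-directed synthesis: filter a small
# candidate pool by the examples f 1 => 1, f 2 => 4, f 5 => 25.

candidates = [
    ("n + n", lambda n: n + n),
    ("n * n", lambda n: n * n),
    ("n * n * n", lambda n: n * n * n),
]
examples = [(1, 1), (2, 4), (5, 25)]

def consistent(f):
    # output must match each example and be a nat (non-negative) on the examples
    return all(f(i) == o and f(i) >= 0 for (i, o) in examples)

found = next(name for name, f in candidates if consistent(f))
print(found)  # n * n
```

The examples act like the refinement: they prune every candidate body except squaring.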


  52. Adding constraints
    Story generation:
    Enforcing intentional behavior in characters
    52


  53. Intentional reasoning
    From classical AI to modern FP
    Plots as plans as proofs
    Intentional reasoning
    Epistemic reasoning
    53


  54. Game AI: from opponents to cooperators
    Computer opponents (e.g. Chess, Go):

    moves need not be intelligible to human
    Computer cooperators:

    need to move in a legible, intentional-seeming way
    54


  55. Grice's Maxims of Conversation
    ❖ Quantity - say neither too much nor too little
    ❖ Quality - tell the truth
    ❖ Relation - what is said should be relevant to
    the topic/stated intentions
    ❖ Manner - avoid obscurity and ambiguity
    55
    (Paul Grice, “Logic and Conversation,” 1975)


  56. Hanabi
    Your hand is visible
    only to others
    Your shared goal is to
    build sequentially-ordered stacks of like-color cards
    56


  57. Hanabi
    One turn:
    Play,
    Discard, or
    Give hint (spends a
    token)
    57

    View Slide

  58. Hanabi: hints
    58


  59. Computational cooperation in Hanabi
    ❖ Model the player’s knowledge, including their
    model of our knowledge
    ❖ Give hints to human players that will be
    recognized as intentional: point out cards when
    unambiguous whether to play or discard


    59


  60. Computational cooperation in Hanabi
    ❖ Model the player’s knowledge, including their
    model of our knowledge
    ❖ Give hints to human players that will be
    recognized as intentional: point out cards when
    unambiguous whether to play or discard


    60
    }“Intentional”

    model


  61. Computational cooperation in Hanabi
    ❖ Model the player’s knowledge, including their
    model of our knowledge
    ❖ Give hints to human players that will be
    recognized as intentional: point out cards when
    unambiguous whether to play or discard
    ❖ Interpret hints from human players as carrying the
    same intent; play or discard appropriately
    61
    }“Intentional”

    model
    {
    “Full”

    model
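One way to sketch the unambiguity rule in Python (my encoding, not the paper's implementation; the card representation and the playable/discardable sets are assumptions): a hint is "intentional" only if every card it touches is playable, or every card it touches is discardable.

```python
# Hypothetical sketch of the intentional hinting rule: only emit hints whose
# touched cards are unambiguously all-playable or all-discardable.

def intentional_hints(hand, playable, discardable):
    # hand: list of (color, rank) tuples in the partner's hand
    hints = []
    for feature, kind in ((0, "color"), (1, "rank")):
        for value in {card[feature] for card in hand}:
            touched = [card for card in hand if card[feature] == value]
            if all(card in playable for card in touched):
                hints.append((kind, value, "play"))
            elif all(card in discardable for card in touched):
                hints.append((kind, value, "discard"))
    return hints

hand = [("red", 1), ("red", 2), ("blue", 1)]
playable = {("red", 1), ("blue", 1)}
discardable = {("red", 2)}
hints = intentional_hints(hand, playable, discardable)
print(sorted(hints))
```

Hinting "red" is excluded here: it touches one playable and one discardable card, so the partner could not recognize a single intent behind it.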


  62. Computational Intelligence in Games (CIG) 2017
    3 implementations (base, intentional, full)
    +
    experimental evaluation
    1. Simulation of computer:computer
    2. Human subjects study, human:computer
    62


  63. Computational Intelligence in Games (CIG) 2017
    63


  64. Computational Intelligence in Games (CIG) 2017
    64
    Players rated more intentional agents both
    more enjoyable and more intentional-seeming


  65. Grice's Maxims of Conversation
    Quantity - say neither too much nor too little
    ❖ Enforced by game rules
    65


  66. Grice's Maxims of Conversation
    Quality - tell the truth
    ❖ Players have shared goals
    66


  67. Grice's Maxims of Conversation
    Relation - be relevant
    ❖ Only give hints that can be
    interpreted as play/discard
    instructions
    67


  68. Grice's Maxims of Conversation
    Manner - avoid ambiguity
    ❖ Only hint sets of cards for
    which the actionable subset is
    unambiguous
    68


  69. Limitation
    Only one step of look-ahead
    How to apply intentional reasoning to search?
    69


  70. Intentionality depends on Belief
    Transition from ad-hoc encoding of knowledge to formal
    Describe how beliefs influence actions and vice versa
    70


  71. Epistemic reasoning
    From classical AI to modern FP
    Plots as plans as proofs
    Intentional reasoning
    Epistemic reasoning
    71


  72. Theory of Mind
    Forming mental models of other
    people’s mental models
    72


  73. Theory of Mind
    Forming mental models of other
    people’s mental models
    A key form of social reasoning in
    humans
    73


  74. 3 Logicians Walk into a Pub
    74
    Will everyone
    have a beer?


  75. 3 Logicians Walk into a Pub
    75
    Will everyone
    have a beer?
    I don’t know


  76. 3 Logicians Walk into a Pub
    76
    Will everyone
    have a beer?
    “I don’t know”  “I don’t know”


  77. 3 Logicians Walk into a Pub
    77
    Will everyone
    have a beer?
    “I don’t know”  “I don’t know”  “Yes!”
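The punchline is pure theory-of-mind reasoning, and it can be simulated: each logician knows only their own desire, and a prior "I don't know" reveals that its speaker wants a beer (otherwise they could already have answered "No"). A hypothetical Python sketch, not from the talk:

```python
# Simulating the pub joke: the bartender asks "Will everyone have a beer?"
# Each logician in turn answers from their own desire plus the prior answers.

def answers(desires):
    out = []
    for i, wants in enumerate(desires):
        if not wants:
            out.append("No")   # one counterexample settles the question
            break
        # wants a beer; every earlier "I don't know" revealed that speaker does too
        if i == len(desires) - 1:
            out.append("Yes!")  # last logician: all desires are now known
        else:
            out.append("I don't know")
    return out

print(answers([True, True, True]))   # ["I don't know", "I don't know", "Yes!"]
print(answers([True, False, True]))  # ["I don't know", "No"]
```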


  78. Epistemic Logic
    ☐a A    Agent a believes A
    78


  79. Dynamic Epistemic Logic
    79
    ☐a A

    [α]A
    Agent a believes A

    A holds under action α


  80. Agent a believes A

    A holds under action α
    Dynamic Epistemic Logic
    ☐a A

    [α]A
    80
    Actions may change the 

    actual world or the

    set of worlds an agent considers possible


  81. Epistemic Actions (Baltag 2002)
    α, β ::= flip p      (change truth value)
         | ?A            (precondition)
         | α + β         (nondeterministic choice)
         | α ; β         (sequence)
    81


  82. Epistemic Actions (Baltag 2002)
    α, β ::= flip p      (change truth value)
         | ?A            (precondition)
         | α + β         (nondeterministic choice)
         | α ; β         (sequence)
         | α_a           (appearance to a)
         | α*            (public action)
    82


  83. Semantics: Possible Worlds
    Epistemic states:
    sets of worlds each agent considers possible
    83


  84. Semantics: Possible Worlds
    An actual world
    84
    Set of worlds an agent considers possible


  85. Semantics: Possible Worlds
    These sets can expand and contract based on actions
    85
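A minimal Python sketch of this semantics (my encoding for illustration, not Baltag's formal one): a world is a set of true atoms, an epistemic state is a set of worlds, and each action form maps world-sets to world-sets.

```python
# Hypothetical possible-worlds sketch: worlds are frozensets of true atoms;
# actions are functions on sets of worlds.

def flip(p):
    return lambda worlds: {w ^ frozenset({p}) for w in worlds}  # toggle p everywhere

def precond(p):
    return lambda worlds: {w for w in worlds if p in w}  # ?p keeps worlds where p holds

def choice(a, b):
    return lambda worlds: a(worlds) | b(worlds)  # a + b: can't tell which branch ran

def seq(a, b):
    return lambda worlds: b(a(worlds))  # a ; b

worlds = {frozenset(), frozenset({"p"})}            # agent is unsure about p
contracted = precond("p")(worlds)                   # testing p contracts the set
expanded = choice(flip("q"), precond("p"))(worlds)  # choice expands it
print(len(contracted), len(expanded))  # 1 3
```

This makes the slide's point concrete: a test (?A) can only shrink the set of possible worlds, while nondeterministic choice can grow it.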


  86. Semantics: Possible Worlds
    86
    α
    Apply α


  87. α
    α
    α
    α
    Semantics: Possible Worlds
    87
    α
    Apply α


  88. Semantics: Possible Worlds
    88
    Apply α + β


  89. Semantics: Possible Worlds
    89
    α
    Apply α + β
    β
    or
    α
    α
    α
    α
    β
    β
    β
    β


  90. Semantics: Possible Worlds
    90
    Apply ?(· ∨ ·); α


  91. Semantics: Possible Worlds
    91
    α
    Apply ?(· ∨ ·); α
    α α


  92. –Petyr Baelish
    “Fight every battle everywhere, always, in your mind. Everyone is
    your enemy, everyone is your friend. Every possible series of events
    is happening all at once. Live that way and nothing will surprise
    you. Everything that happens will be something that you’ve seen
    before.”
    92


  93. A tiny example
    Alice flips a coin in secret;

    Bob and Carol observe
    93
    coin(heads)_Alice + coin(tails)_Alice ;
    (coin(heads) + coin(tails))_{Bob, Carol}
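A hypothetical Python sketch of the resulting epistemic states (my encoding, not from the talk): Alice's appearance set collapses to the actual outcome, while Bob and Carol are left considering both worlds.

```python
import random

# After the secret flip: each agent's "appearance" is the set of worlds
# (coin outcomes) they consider possible.

actual = random.choice(["heads", "tails"])  # the secret outcome

appears = {
    "Alice": {actual},             # she saw the flip
    "Bob":   {"heads", "tails"},   # saw *that* a flip happened, not the result
    "Carol": {"heads", "tails"},
}

def knows_outcome(agent):
    return len(appears[agent]) == 1  # knowledge = a single possible world

print(knows_outcome("Alice"), knows_outcome("Bob"))  # True False
```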


  94. A practical action language?
    94
    Aggregate sums/products for expressing more than Boolean changes

    Term size blowup


  95. Ostari - Eger & Martens 2017
    Parameterized actions

    Parameters can be secret to specific sets of agents
    95
    ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩)


  96. Ostari - Eger & Martens 2017
    ⟨…⟩ ::= ⟨…⟩ := ⟨…⟩
      | learn ⟨…⟩
      | public ⟨…⟩
      | ⟨…⟩ ; ⟨…⟩
    96
    ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩)


  97. Ostari - Eger & Martens 2017
    ⟨…⟩ ::= ⟨…⟩ := ⟨…⟩
      | learn ⟨…⟩
      | public ⟨…⟩
      | ⟨…⟩ ; ⟨…⟩
    97
    ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩)


  98. Ostari - Eger & Martens 2017
    ⟨…⟩ ::= ⟨…⟩ := ⟨…⟩
      | learn ⟨…⟩
      | public ⟨…⟩
      | ⟨…⟩ ; ⟨…⟩
    ⟨…⟩ ::= exists x. ⟨…⟩
      | all x. ⟨…⟩
      | each x. ⟨…⟩
      | which x. ⟨…⟩
    98
    98
    ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩)


  99. Ostari - Eger & Martens 2017
    ⟨…⟩ ::= ⟨…⟩ := ⟨…⟩
      | learn ⟨…⟩
      | public ⟨…⟩
      | ⟨…⟩ ; ⟨…⟩
    ⟨…⟩ ::= exists x. ⟨…⟩
      | all x. ⟨…⟩
      | each x. ⟨…⟩
      | which x. ⟨…⟩
    99
    99
    ⟨action⟩ ::= ⟨identifier⟩(⟨params⟩)


  100. Epistemic Quantifiers
    100
    learn (p): exists c in Cards: at(deck, 1) == c.
    Useless knowledge


  101. Epistemic Quantifiers
    101
    learn (p): which c in Cards: at(deck, 1) == c.
    Useful knowledge


  102. Example
    102
    hintcolor(p: Players, c: Colors)
    learn (p): Each i in HandIndex:
    color(at(p, i)) == c
    Hinting a color in Hanabi
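A rough Python reading of what the hinted player learns (the hand layout is an assumption for illustration): because the quantifier is "each" rather than "exists", the player learns, index by index, exactly which of their cards have color c.

```python
# Hypothetical concretization of hintcolor's "Each i in HandIndex" knowledge:
# the hinted player learns the exact set of indices whose card has color c.

hand = {1: ("red", 1), 2: ("blue", 3), 3: ("red", 4)}  # index -> (color, rank)

def hintcolor(hand, c):
    return {i for i, (color, _) in hand.items() if color == c}

print(sorted(hintcolor(hand, "red")))  # [1, 3]
```

An "exists"-style hint would only tell the player that some red card is somewhere in their hand; "each ... which" is the useful, per-position knowledge.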


  103. Implementation in Haskell

    (laziness useful for possible worlds representation!)
    Sequences of actions and queries
    Goals that generate action sequences
    Ostari - Eger & Martens 2017
    103


  104. Epistemic Narrative Generation
    WorldA:

    at(Sherlock) = BakerStreet

    looks like (Sherlock): WorldA

    looks like (Moriarty): WorldA, WorldB
    WorldB:

    at(Sherlock) = ScotlandYard

    looks like (Sherlock): WorldB

    looks like (Moriarty): WorldA, WorldB
    104


  105. Epistemic Narrative Generation
    goal: 

    murdered(Moriarty, Victim) ^

    ☐(Moriarty): ☐(Sherlock):
    exists M. 

    M != Moriarty ^ murdered(M, Victim)
    105


  106. Epistemic Narrative Generation
    Generates a story in which Moriarty frames someone for murder
    goal: 

    murdered(Moriarty, Victim) ^

    ☐(Moriarty): ☐(Sherlock):
    exists M. 

    M != Moriarty ^ murdered(M, Victim)
    106


  107. Summary
    3 investigations into using logic for creativity:
    1. Generating narrative with planning/linear logic
    2. Intentional reasoning in the context of cooperation
    3. Codifying intentionality with Epistemic Logic
    107


  108. From modern FP to future AI
    From classical AI to modern FP
    Plots as plans as proofs
    Intentional reasoning
    Epistemic reasoning
    From modern FP to future AI
    108


  109. Compositionality
    FP’s secret weapon
    ⟦A ∘ B⟧ == ⟦A⟧ ⟦∘⟧ ⟦B⟧
    109


  110. Compositionality
    110
    [figure: x → f → y → g → z]
    Function composition (meaning [-] is interface)


  111. f ∘ g
    Compositionality
    Function composition (meaning [-] is interface)
    111
    [figure: x → (f ∘ g) → z]


  112. Compositionality
    Proof composition (use of lemmas)
    112

    Δ ⊢ A        Δ’, A ⊢ C


  113. Compositionality
    113
    Δ, Δ’ ⊢ C

    Proof composition (use of lemmas)


  114. Compositionality
    In planning: Frame Principle
    (“horizontal” composition)
    114
    [figure: plan π on Δ beside plan ρ on Δ’]


  115. Compositionality
    In planning: Frame Principle
    (“horizontal” composition)
    115
    [figure: π || ρ acting on Δ, Δ’]


  116. Compositionality
    In planning/linear logic: Frame Principle
    (“horizontal” composition)
    116
    [figure: π || ρ acting on Δ, Δ’]


  117. Research Challenges
    Dynamic logics with functional action languages
    Constructive/lambda-calculable DEL
    Epistemic session types
    117


  118. Compositional Creativity?
    asking computers to generate 

    insights, behaviors, narrative

    based on logical specifications
    118


  119. Why Games and Stories?
    Logic is a great intermediate language
    between humans and machines, but
    insufficient
    Humans have been communicating through
    play and storytelling for even longer than
    they have been doing logic
    The best abstractions are discovered in
    diverse disciplines!
    119


  120. Thanks!
    Chris Martens
    http://go.ncsu.edu/martens
    [email protected]
    @chrisamaphone
    github.com/chrisamaphone
    120
