
Sentence_simplification_with_deep_reinforcement_learning.pdf

MARUYAMA
May 21, 2018


Transcript

  1. Sentence Simplification
    with Deep Reinforcement
    Learning
    Proceedings of the 2017 Conference on Empirical Methods in
    Natural Language Processing, pp. 584–594
    Xingxing Zhang, Mirella Lapata
    Nagaoka University of Technology Takumi Maruyama

  2. Abstract
    Ø Sentence simplification aims to make sentences
    easier to read and understand
    Ø This paper proposes an encoder-decoder model
    coupled with a deep reinforcement learning
    framework for text simplification
    Ø The proposed model outperforms competitive
    simplification systems in experiments.


  3. Reinforcement Learning for Sentence Simplification
    Ø This paper proposes the following two models:
    • Deep Reinforcement learning sentence simplification
    model (DRESS)
    • DRESS + Lexical Simplification model (DRESS-LS)

  4. DRESS
    Ø An overview of the deep reinforcement learning
    simplification model

  5. DRESS
    Ø An overview of the deep reinforcement learning
    simplification model
    Encoder-Decoder model

  6. DRESS
    Ø An overview of the deep reinforcement learning
    simplification model
    Reinforcement Learning

  7. DRESS
    Ø Reward
    $r(X, Y, \hat{Y}) = \lambda_S\, r_S(X, Y, \hat{Y}) + \lambda_R\, r_R(X, \hat{Y}) + \lambda_F\, r_F(\hat{Y})$
    where $\lambda_S, \lambda_R, \lambda_F \in [0, 1]$
    $r_S$: the simplicity reward
    $r_R$: the relevance reward
    $r_F$: the fluency reward
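    The weighted combination above can be sketched in a few lines; the function name and the default weight values below are illustrative placeholders, not the values tuned in the paper:

    ```python
    def combined_reward(r_s, r_r, r_f, lambda_s=1.0, lambda_r=0.5, lambda_f=0.5):
        """Weighted sum of simplicity, relevance, and fluency rewards.

        Each weight must lie in [0, 1]; the defaults here are
        illustrative, not the values used in the paper.
        """
        for lam in (lambda_s, lambda_r, lambda_f):
            if not 0.0 <= lam <= 1.0:
                raise ValueError("reward weights must be in [0, 1]")
        return lambda_s * r_s + lambda_r * r_r + lambda_f * r_f
    ```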

  8. DRESS
    Ø Reward
    • Simplicity: $r_S = \beta\,\mathrm{SARI}(X, \hat{Y}, Y) + (1 - \beta)\,\mathrm{SARI}(X, Y, \hat{Y})$
    • Relevance: $r_R = \cos(q_X, q_{\hat{Y}}) = \frac{q_X \cdot q_{\hat{Y}}}{\lVert q_X \rVert\, \lVert q_{\hat{Y}} \rVert}$
    • Fluency: $r_F = \exp\!\left( \frac{1}{|\hat{Y}|} \sum_{i=1}^{|\hat{Y}|} \log P_{LM}(\hat{y}_i \mid \hat{y}_{0:i-1}) \right)$
    where $q_X$ and $q_{\hat{Y}}$ are sentence vectors
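    The relevance and fluency terms follow directly from their definitions and can be sketched as below. Here the sentence vectors are plain Python lists and the per-token log-probabilities are assumed to come from a trained language model (in the paper, sentence vectors come from LSTM encoders); this is a sketch, not the paper's implementation:

    ```python
    import math

    def relevance_reward(q_x, q_y_hat):
        """Cosine similarity between source and output sentence vectors."""
        dot = sum(a * b for a, b in zip(q_x, q_y_hat))
        norm_x = math.sqrt(sum(a * a for a in q_x))
        norm_y = math.sqrt(sum(b * b for b in q_y_hat))
        return dot / (norm_x * norm_y)

    def fluency_reward(token_log_probs):
        """Exponential of the mean per-token LM log-probability."""
        return math.exp(sum(token_log_probs) / len(token_log_probs))
    ```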


  9. DRESS-LS
    Ø Lexical simplification is a task that replaces
    complex words with simpler alternatives
    Ø This paper uses a pre-trained encoder-decoder
    model for lexical simplification
    Ø $P(y_t \mid y_{1:t-1}, X) = (1 - \eta)\, P_{DRESS}(y_t \mid y_{1:t-1}, X) + \eta\, P_{LS}(y_t \mid X, z_t)$
    where $\eta \in [0, 1]$
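    The linear interpolation of the two next-token distributions can be sketched as follows; distributions are represented as token-to-probability dicts, and the names `p_dress` / `p_ls` and the default value of `eta` are assumptions for illustration:

    ```python
    def interpolate_step(p_dress, p_ls, eta=0.1):
        """Mix the simplification model's next-token distribution with the
        lexical simplification distribution: (1 - eta) * p_dress + eta * p_ls.

        Both inputs map tokens to probabilities; eta is in [0, 1]
        (0.1 here is only an illustrative value).
        """
        tokens = set(p_dress) | set(p_ls)
        return {t: (1 - eta) * p_dress.get(t, 0.0) + eta * p_ls.get(t, 0.0)
                for t in tokens}
    ```

    Because both inputs are probability distributions and the weights sum to one, the mixed output is itself a valid distribution.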

  10. Experimental Setup
    Ø Three simplification datasets
    • WikiSmall (Zhu et al. 2010)
    • WikiLarge (Kauchak 2013, Woodsend and Lapata 2011, Zhu et al. 2010)
    • Newsela (Xu et al. 2015)
    Dataset      Train    Dev.   Test
    WikiSmall   89,042     205    100
    WikiLarge  296,402   2,000    359
    Newsela     94,208   1,129  1,076

  11. Experimental Setup
    Ø Comparison systems
    • PBMT-R:
    A monolingual phrase-based machine translation model
    with a reranking post-processing step
    • Hybrid:
    A hybrid semantic-based model that combines a
    simplification model and a monolingual machine
    translation model
    • SBMT-SARI:
    A syntax-based translation model trained with PPDB and
    tuned with SARI

  12. Results
    (Five-point Likert scale)

  13. Results
    (Five-point Likert scale)

  14. Results
    (Five-point Likert scale)

  15. Results

  16. Results