
[Paper Introduction] Attention-GAN

nodaki
July 14, 2018


Attention-GAN for Object Transfiguration in Wild Images

https://arxiv.org/abs/1803.06798


Transcript

1. Overview: Attention-GAN for Object Transfiguration in Wild Images
▸ Bibliographic information: https://arxiv.org/abs/1803.06798, submitted 19 Mar 2018
▸ Overview: a network that learns a mapping (translation) between image domains. By incorporating an attention mechanism, only the regions worth attending to are translated, which succeeds in generating cleaner images. A sketch of this masked mapping follows below.
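To make "translating only the attended regions" concrete: the slide's description matches an attention-masked composition of roughly the following form. The symbols here (A_X for the attention network producing a mask in [0, 1], F for the X → Y generator) are my notation for illustration, not necessarily the paper's exact symbols.

```latex
% Minimal sketch (assumed notation): the generator output is kept only
% where the attention mask is active, and the input is kept elsewhere.
\hat{y} = A_X(x) \odot F(x) + \bigl(1 - A_X(x)\bigr) \odot x
```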
2. Model: AttentionGAN = CycleGAN + attention mechanism
▸ Cycle consistency loss: over both round trips X → Y → X and Y → X → Y
▸ Attention loss: an attention cycle-consistent loss and an attention sparse loss; an L1 norm is added so that the attended region stays as small (sparse) as possible. See the loss sketch below.
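A minimal sketch of the loss terms named on this slide, in CycleGAN-style notation (F: X → Y and G: Y → X generators, A_X and A_Y attention networks). The exact form of the attention cycle-consistent term is my reading of the slide, not a quote from the paper.

```latex
% Cycle-consistency loss over both round trips X -> Y -> X and Y -> X -> Y:
\mathcal{L}_{\mathrm{cyc}} =
    \mathbb{E}_{x}\bigl[\lVert G(F(x)) - x \rVert_1\bigr]
  + \mathbb{E}_{y}\bigl[\lVert F(G(y)) - y \rVert_1\bigr]

% Attention cycle-consistent loss (assumed form): the attended region
% should be preserved through the translation round trip:
\mathcal{L}_{\mathrm{att\mbox{-}cyc}} =
    \mathbb{E}_{x}\bigl[\lVert A_Y(F(x)) - A_X(x) \rVert_1\bigr]
  + \mathbb{E}_{y}\bigl[\lVert A_X(G(y)) - A_Y(y) \rVert_1\bigr]

% Attention sparse loss: the L1 norm that keeps the attended region small:
\mathcal{L}_{\mathrm{sparse}} =
    \mathbb{E}_{x}\bigl[\lVert A_X(x) \rVert_1\bigr]
  + \mathbb{E}_{y}\bigl[\lVert A_Y(y) \rVert_1\bigr]
```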
3. Model: Supervised Learning
▸ Attention ≒ Segmentation: if segmentation labels are available, the attention network can simply be trained to solve a segmentation problem (attention supervised loss).
▸ Total loss: λ controls the relative importance of the cycle-consistency loss and the attention loss. A code sketch of this combination follows below.
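A minimal PyTorch sketch of how the total loss on this slide can be combined; the function and argument names, the default lambda values, and the BCE choice for the supervised attention term are illustrative assumptions, not the paper's reference implementation. The adversarial GAN term is omitted for brevity.

```python
import torch.nn.functional as F_nn

def total_loss(x, x_rec, attn_mask, seg_label=None,
               lambda_cyc=10.0, lambda_attn=1.0):
    """Combine the slide's loss terms for one X -> Y -> X pass (sketch).

    x:         input image batch
    x_rec:     reconstruction G(F(x))
    attn_mask: attention mask A_X(x), values in [0, 1]
    seg_label: optional segmentation mask; when given, the attention
               network is also trained on a segmentation problem
               (the slide's "Attention ~= Segmentation" idea)
    """
    # Cycle-consistency: the round trip should reproduce the input.
    loss_cyc = F_nn.l1_loss(x_rec, x)

    # Attention sparse loss: L1 norm keeps the attended region small.
    loss_attn = attn_mask.abs().mean()

    # Attention supervised loss (assumed BCE form) when labels exist.
    if seg_label is not None:
        loss_attn = loss_attn + F_nn.binary_cross_entropy(attn_mask, seg_label)

    # The lambda weights control the relative importance of the terms.
    return lambda_cyc * loss_cyc + lambda_attn * loss_attn
```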
4. Training: Experiments
▸ Datasets: ImageNet (tiger: 1,444 images, leopard: 1,396 images) and MSCOCO (horse, zebra). Images are resized to 286 × 286, then randomly cropped to 256 × 256.
▸ Training strategy: Adam optimizer with LR 0.0002 for the first ~100 epochs, then linearly decayed to 0 by ~200 epochs; batch size 1. A setup sketch follows below.
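A minimal PyTorch/torchvision sketch of the preprocessing and learning-rate schedule described on this slide; the placeholder model and the Adam betas are assumptions (the slide only states Adam with LR 0.0002), while the resize/crop sizes and epoch counts follow the slide.

```python
import torch
from torch.optim.lr_scheduler import LambdaLR
from torchvision import transforms

# Resize to 286 x 286, then take a random 256 x 256 crop.
preprocess = transforms.Compose([
    transforms.Resize((286, 286)),
    transforms.RandomCrop(256),
    transforms.ToTensor(),
])

model = torch.nn.Conv2d(3, 3, 3)  # placeholder for the actual generator
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4,
                             betas=(0.5, 0.999))  # betas: assumed values

def lr_lambda(epoch, decay_start=100, total=200):
    # Constant LR for the first ~100 epochs, then linear decay to 0
    # by epoch ~200, as described on the slide.
    if epoch < decay_start:
        return 1.0
    return max(0.0, 1.0 - (epoch - decay_start) / float(total - decay_start))

scheduler = LambdaLR(optimizer, lr_lambda=lr_lambda)
# Call scheduler.step() once per epoch after the training loop body.
```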
5. Results: Comparison with CycleGAN
▸ Qualitative comparison (left column: input, middle: AttentionGAN, right: CycleGAN). AttentionGAN in particular preserves the background information of the input.
▸ Quantitative comparison: survey results on which of AttentionGAN and CycleGAN generates the better image (blue: AttentionGAN, red: CycleGAN).