2020 AI Paper Roundup

ai.labo.ocu

December 02, 2020

Transcript

  1. Osaka City University, 植田大樹. 2020 AI Roundup: looking back on a year of running our AI paper reading group

  2. Top Recent in Last Year
     1. End-to-End Object Detection with Transformers
     2. A Simple Framework for Contrastive Learning of Visual Representations
     3. A Primer in BERTology: What we know about how BERT works
     4. Reformer: The Efficient Transformer
     5. Language Models are Few-Shot Learners
     6. Supervised Contrastive Learning
     7. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
     8. Explainable Deep Learning: A Field Guide for the Uninitiated
     9. YOLOv4: Optimal Speed and Accuracy of Object Detection
     10. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
  3. Top Recent in Last Year (the same top-10 list as slide 2): the papers our study group covered
  4. What was your year with AI like?

  5. Transformer (the same top-10 list as slide 2)
  6. Contrastive Learning (the same top-10 list as slide 2)
  7. Transformer and Contrastive Learning (the same top-10 list as slide 2)
  8. Transformer: the encoder-decoder

  9. Transformer: the encoder-decoder

  10. Transformer: the encoder-decoder. 私は猫を飼っている → (compressed information = features) → I have cats
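
A minimal sketch of this encoder-decoder idea (my own illustration in PyTorch; the deck names no framework, and the toy sizes are assumptions). The encoder compresses the source tokens into a feature representation ("memory"), and the decoder generates the target sequence from those features.

```python
# Encoder-decoder sketch: source -> compressed features -> target.
import torch
import torch.nn as nn

d_model = 32
src_vocab, tgt_vocab = 100, 100  # hypothetical toy vocabulary sizes

src_embed = nn.Embedding(src_vocab, d_model)
tgt_embed = nn.Embedding(tgt_vocab, d_model)
transformer = nn.Transformer(d_model=d_model, nhead=4,
                             num_encoder_layers=2, num_decoder_layers=2,
                             batch_first=True)
to_logits = nn.Linear(d_model, tgt_vocab)

src = torch.randint(0, src_vocab, (1, 5))  # e.g. token ids for 私は猫を飼っている
tgt = torch.randint(0, tgt_vocab, (1, 3))  # e.g. token ids for "I have cats"

memory = transformer.encoder(src_embed(src))       # compressed information = features
out = transformer.decoder(tgt_embed(tgt), memory)  # decode conditioned on the features
print(to_logits(out).shape)                        # torch.Size([1, 3, 100])
```
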
  11. Transformer

  12. Transformer: the encoder-decoder

  13. Transformer: an "encoder-decoder", mainly enhanced with a trick called Attention

  14. Transformer: Attention (Attention is All You Need). A model that extracts relevance from sequential data

  15. Transformer: Attention. The query/key relevance matrix:

      Query\Key    I     have   cats
      I            0.88  0.10   0.02
      have         0.08  0.80   0.12
      cats         0.03  0.14   0.83
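
The table above is what scaled dot-product attention computes: softmax(QKᵀ / √d), each row summing to 1. A minimal sketch (my own illustration, not the deck's code; the sizes are arbitrary):

```python
# Scaled dot-product attention: computing the query/key relevance matrix.
import torch
import torch.nn.functional as F

d = 8
q = torch.randn(3, d)  # queries for "I", "have", "cats"
k = torch.randn(3, d)  # keys for "I", "have", "cats"
v = torch.randn(3, d)  # values to be mixed

scores = q @ k.T / d ** 0.5       # raw relevance scores, shape (3, 3)
attn = F.softmax(scores, dim=-1)  # rows sum to 1, like the slide's table
out = attn @ v                    # each output is a relevance-weighted mix of values
print(attn)
```
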
  16. Transformer: an "encoder-decoder", mainly enhanced with a trick called Attention

  17. Transformer: an "encoder-decoder", mainly enhanced with a trick called Attention = a model that learns relevance (the features) through Attention

  18. Transformer: representative models. Detection: DETR. Classification: ViT.
      NLP: BERT, GPT, etc. (End-to-End Object Detection with Transformers: June
      session; An Image is Worth 16x16 Words: Transformers for Image Recognition
      at Scale: November session; → Deformable DETR)
  19. Transformer: DETR, a detection model built from a CNN + a Transformer

  20. Transformer: DETR. Its conceptual predecessor: a CNN + RNN (End-to-end people detection in crowded scenes)
  21. Transformer: DETR. Its conceptual predecessor: a CNN + RNN (End-to-end people detection in crowded scenes)
  22. Transformer: Attention (Attention is All You Need). A model that extracts relevance from sequential data

  23. Transformer: Attention (Attention is All You Need). A model that extracts relevance from sequential data

  24. Transformer: DETR. Its conceptual predecessor: a CNN + RNN (End-to-end people detection in crowded scenes) ← sequential data!
  25. Transformer: DETR. Its conceptual predecessor: a CNN + RNN (End-to-end people detection in crowded scenes) ← swap the RNN for a Transformer!
  26. Transformer: DETR = CNN + Transformer

  27. Transformer DETR
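
A conceptual sketch of the DETR pipeline shown on these slides (my own compressed version, loosely modelled on the simplified code in the DETR paper; the layer counts, sizes, and omitted positional encodings are simplifying assumptions):

```python
# DETR-style sketch: CNN features flattened into a sequence, processed by a
# Transformer, decoded by learned object queries into class/box predictions.
import torch
import torch.nn as nn
import torchvision

class MiniDETR(nn.Module):
    def __init__(self, num_classes=91, d_model=256, num_queries=100):
        super().__init__()
        backbone = torchvision.models.resnet50(weights=None)
        self.backbone = nn.Sequential(*list(backbone.children())[:-2])  # CNN feature map
        self.proj = nn.Conv2d(2048, d_model, kernel_size=1)
        self.transformer = nn.Transformer(d_model, nhead=8, batch_first=True)
        self.query_embed = nn.Embedding(num_queries, d_model)  # learned object queries
        self.class_head = nn.Linear(d_model, num_classes + 1)  # +1 for "no object"
        self.box_head = nn.Linear(d_model, 4)
        # NOTE: real DETR also adds positional encodings; omitted here for brevity.

    def forward(self, images):
        feats = self.proj(self.backbone(images))  # (B, d_model, H/32, W/32)
        seq = feats.flatten(2).transpose(1, 2)    # spatial map -> sequential data!
        queries = self.query_embed.weight.unsqueeze(0).expand(images.size(0), -1, -1)
        hs = self.transformer(seq, queries)       # encoder-decoder over the sequence
        return self.class_head(hs), self.box_head(hs).sigmoid()

logits, boxes = MiniDETR()(torch.randn(1, 3, 256, 256))
print(logits.shape, boxes.shape)  # (1, 100, 92) and (1, 100, 4)
```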

  28. Transformer: ViT, a classification model built from the Transformer's encoder

  29. Transformer: the encoder-decoder. Compressed information = features

  30. Transformer encoder: Classification

  31. Transformer encoder: Classification → → class prediction

  32. Transformer: Attention (Attention is All You Need). A model that extracts relevance from sequential data

  33. (image-only slide)
  34. (image-only slide)
  35. (image-only slide)
  36. Sequential data! ↓

  37. Transformer: ViT. Sequential data! → Transformer encoder → → classification task

  38. Transformer ViT
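
A sketch of the ViT idea from these slides (my own illustration; the dimensions and depth are arbitrary assumptions, not the paper's configuration): the image is cut into patches, the patches become the sequential data, and only the Transformer encoder is used, with a [CLS]-style token for classification.

```python
# ViT sketch: image -> patches -> tokens -> Transformer encoder -> class.
import torch
import torch.nn as nn

class MiniViT(nn.Module):
    def __init__(self, img=224, patch=16, d_model=192, num_classes=10):
        super().__init__()
        n = (img // patch) ** 2  # 14 x 14 = 196 patch tokens
        self.patchify = nn.Conv2d(3, d_model, patch, stride=patch)  # linear patch embedding
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))         # [CLS]-style token
        self.pos = nn.Parameter(torch.zeros(1, n + 1, d_model))     # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        tokens = self.patchify(x).flatten(2).transpose(1, 2)  # patches -> sequential data!
        cls = self.cls.expand(x.size(0), -1, -1)
        h = self.encoder(torch.cat([cls, tokens], dim=1) + self.pos)
        return self.head(h[:, 0])  # classify from the [CLS] token

print(MiniViT()(torch.randn(2, 3, 224, 224)).shape)  # torch.Size([2, 10])
```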

  39. (image-only slide)
  40. Contrastive Learning: the concept. Learning that exploits symmetry

  41. Contrastive Learning: the concept. Learning that exploits symmetry → bringing more human-like learning into the parts outside the model itself; the surprises came above all in "Unsupervised Learning"

  42. Contrastive Learning: the representative model, SimCLR (A Simple Framework for Contrastive Learning of Visual Representations: March session)
  43. Contrastive Learning: the architecture. A CNN (ResNet50 or the like), plus a trick on the input and a trick in the loss

  44. Contrastive Learning: the tricks (https://ai-scholar.tech/). Input trick: augmentation. Loss trick: pull views of the same image together, push different images apart →
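
A sketch of that loss trick, the NT-Xent loss used by SimCLR (my own minimal version of the standard formulation, not the paper's reference code): two augmented views of the same image are positives, and everything else in the batch is a negative.

```python
# NT-Xent (SimCLR's contrastive loss): pull the two views of the same image
# together, push all other images in the batch apart.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d) unit vectors
    sim = z @ z.T / tau                                 # temperature-scaled cosine similarity
    sim.fill_diagonal_(float('-inf'))                   # a view is not its own positive
    n = z1.size(0)
    # the positive for row i is the other augmented view of the same image
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

z1 = torch.randn(8, 128)  # projections of view 1 (e.g. ResNet50 + MLP head)
z2 = torch.randn(8, 128)  # projections of view 2 (a different augmentation)
print(nt_xent(z1, z2))
```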

  45. Predictions for next year: Transformers reach true practicality! → expect refinement and lighter-weight models. Beyond that: application to segmentation tasks, fusion with GANs and Contrastive Learning...

  46. Resolutions for next year: keep the AI paper study group going; put Transformers to use; hold regular study sessions dedicated to medical AI

  47. Resolutions for next year

  48. (image-only slide)
  49. Have a happy New Year!