
2020 AI Paper Summary (2020年AI論文まとめ)

By 医療AI研究所@大阪公立大学 (Medical AI Research Lab, Osaka Metropolitan University)

  1. Top Recent in Last Year
     1. End-to-End Object Detection with Transformers
     2. A Simple Framework for Contrastive Learning of Visual Representations
     3. A Primer in BERTology: What we know about how BERT works
     4. Reformer: The Efficient Transformer
     5. Language Models are Few-Shot Learners
     6. Supervised Contrastive Learning
     7. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
     8. Explainable Deep Learning: A Field Guide for the Uninitiated
     9. YOLOv4: Optimal Speed and Accuracy of Object Detection
     10. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
  2. Top Recent in Last Year (papers covered in this study group)
     1. End-to-End Object Detection with Transformers
     2. A Simple Framework for Contrastive Learning of Visual Representations
     3. A Primer in BERTology: What we know about how BERT works
     4. Reformer: The Efficient Transformer
     5. Language Models are Few-Shot Learners
     6. Supervised Contrastive Learning
     7. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
     8. Explainable Deep Learning: A Field Guide for the Uninitiated
     9. YOLOv4: Optimal Speed and Accuracy of Object Detection
     10. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
  3. Transformer
     1. End-to-End Object Detection with Transformers
     2. A Simple Framework for Contrastive Learning of Visual Representations
     3. A Primer in BERTology: What we know about how BERT works
     4. Reformer: The Efficient Transformer
     5. Language Models are Few-Shot Learners
     6. Supervised Contrastive Learning
     7. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
     8. Explainable Deep Learning: A Field Guide for the Uninitiated
     9. YOLOv4: Optimal Speed and Accuracy of Object Detection
     10. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
  4. Contrastive Learning
     1. End-to-End Object Detection with Transformers
     2. A Simple Framework for Contrastive Learning of Visual Representations
     3. A Primer in BERTology: What we know about how BERT works
     4. Reformer: The Efficient Transformer
     5. Language Models are Few-Shot Learners
     6. Supervised Contrastive Learning
     7. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
     8. Explainable Deep Learning: A Field Guide for the Uninitiated
     9. YOLOv4: Optimal Speed and Accuracy of Object Detection
     10. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
  5. Transformer and Contrastive Learning
     1. End-to-End Object Detection with Transformers
     2. A Simple Framework for Contrastive Learning of Visual Representations
     3. A Primer in BERTology: What we know about how BERT works
     4. Reformer: The Efficient Transformer
     5. Language Models are Few-Shot Learners
     6. Supervised Contrastive Learning
     7. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch
     8. Explainable Deep Learning: A Field Guide for the Uninitiated
     9. YOLOv4: Optimal Speed and Accuracy of Object Detection
     10. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
  6. Transformer Attention
     Query/key attention weights for the sentence "I have cats" (rows = queries, columns = keys; each row sums to 1):

            I     have  cats
     I      0.88  0.10  0.02
     have   0.08  0.80  0.12
     cats   0.03  0.14  0.83
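  A matrix like the one on this slide comes from scaled dot-product attention: each query is compared with every key, and a row-wise softmax turns the scores into weights. A minimal NumPy sketch follows; the Q/K/V projections here are random stand-ins (hypothetical, not from a trained model), so the numbers will differ from the slide's values.

  ```python
  import numpy as np

  def softmax(x, axis=-1):
      # Numerically stable softmax along the given axis.
      e = np.exp(x - x.max(axis=axis, keepdims=True))
      return e / e.sum(axis=axis, keepdims=True)

  def attention(Q, K, V):
      # Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.
      d_k = Q.shape[-1]
      scores = Q @ K.T / np.sqrt(d_k)      # query-key similarity
      weights = softmax(scores, axis=-1)   # each row sums to 1, as on the slide
      return weights @ V, weights

  # Toy 3-token sequence "I have cats"; random projections stand in for
  # learned W_Q, W_K, W_V applied to token embeddings.
  rng = np.random.default_rng(0)
  d = 4
  Q = rng.normal(size=(3, d))
  K = rng.normal(size=(3, d))
  V = rng.normal(size=(3, d))

  out, w = attention(Q, K, V)
  print(w.shape)  # (3, 3) attention matrix, one row per query token
  ```

  In a real Transformer this runs per head, Q/K/V are linear projections of the same token embeddings (self-attention), and the weighted values are concatenated across heads.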
  7. Transformer: representative models
     Detection: DETR / Classification: ViT / Natural language processing: BERT, GPT, etc.
     End-to-End Object Detection with Transformers: June study session
     An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: November study session → Deformable DETR