OpenTalks.AI - Evgenia Elistratova, How to Use BERT with PyTorch: Basic Information and Application to a Practical Task

OpenTalks.AI
February 21, 2020


Transcript

  1. How to use BERT with PyTorch: basic information and application to a
     practical task
  2. NLP's ImageNet moment [source for the picture]

  3. What is BERT? [source for the picture]

  4. Model Inputs
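
     Not shown on the slide, but a minimal sketch of building these inputs,
     assuming the Hugging Face transformers library on top of PyTorch (the
     checkpoint name and example sentence are illustrative):

        # Build BERT model inputs: the tokenizer inserts the special [CLS]
        # and [SEP] tokens and returns the tensors the model expects.
        from transformers import BertTokenizer

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

        encoded = tokenizer("a visually stunning rumination on love",
                            return_tensors="pt")
        print(encoded["input_ids"])       # token ids; starts with [CLS] (id 101)
        print(encoded["attention_mask"])  # 1 for real tokens, 0 for padding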

  5. Model Outputs

  6. Model Outputs
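
     As a sketch of what those outputs look like in code (the checkpoint name
     is an assumption, not taken from the slides):

        # Inspect BERT's two standard outputs in PyTorch.
        import torch
        from transformers import BertModel, BertTokenizer

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        model = BertModel.from_pretrained("bert-base-uncased")

        encoded = tokenizer("BERT produces contextual embeddings",
                            return_tensors="pt")
        with torch.no_grad():
            outputs = model(**encoded)

        # One hidden vector per input token: (batch, seq_len, 768 for bert-base).
        print(outputs.last_hidden_state.shape)
        # Pooled [CLS] representation, often fed to a classifier: (batch, 768).
        print(outputs.pooler_output.shape)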

  7. Why BERT embeddings? • In the past, neural word embeddings produced by
     models like Word2Vec or fastText were the most powerful ones • Under
     Word2Vec, each word has a fixed representation regardless of its context
     • BERT produces word representations that are dynamically informed by
     the words around them [source for the picture]
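
     To make the contrast concrete, a sketch in the spirit of the cited BERT
     Word Embeddings Tutorial: the same word gets a different vector in each
     context (the sentences and the similarity check are my illustration):

        # Unlike Word2Vec, BERT assigns the same word different vectors
        # depending on the surrounding words.
        import torch
        from transformers import BertModel, BertTokenizer

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        model = BertModel.from_pretrained("bert-base-uncased")

        def word_vector(sentence, word):
            enc = tokenizer(sentence, return_tensors="pt")
            with torch.no_grad():
                hidden = model(**enc).last_hidden_state[0]
            tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
            return hidden[tokens.index(word)]

        v1 = word_vector("he deposited cash at the bank", "bank")
        v2 = word_vector("she sat on the bank of the river", "bank")
        # Similarity well below 1.0: context changed the representation.
        print(torch.cosine_similarity(v1, v2, dim=0).item())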
  8. BERT for sentiment analysis [source for the picture]

  9. Dataset: SST2. The dataset we will use in this example is SST2, which
     contains sentences from movie reviews, each labeled as either positive
     (has the value 1) or negative (has the value 0): [source for the picture]
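
     A sketch of loading the data with pandas; "train.tsv" is a placeholder
     path (the actual file is linked from Alammar's guide cited below):

        # Load the SST2 sentences and their 0/1 sentiment labels.
        import pandas as pd

        df = pd.read_csv("train.tsv", delimiter="\t", header=None,
                         names=["sentence", "label"])
        print(df.head())                   # sentence text plus 0/1 label
        print(df["label"].value_counts())  # positive vs negative counts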
  10. Models: Sentence Sentiment Classification [source for the picture]

  11. Flowing through DistilBERT

  12. Flowing through DistilBERT
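
     In code, "flowing through DistilBERT" amounts to one forward pass over a
     padded batch; a sketch (the checkpoint and the two example sentences are
     assumptions):

        # Pass a padded batch of sentences through DistilBERT.
        import torch
        from transformers import DistilBertModel, DistilBertTokenizer

        tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
        model = DistilBertModel.from_pretrained("distilbert-base-uncased")

        batch = tokenizer(["a masterpiece", "painfully dull"],
                          padding=True, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**batch).last_hidden_state
        print(hidden.shape)  # (batch_size, sequence_length, 768)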

  13. Step #1 [source for the picture]

  14. Step #2 [source for the picture]

  15. Step #3 [source for the picture]
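
     The three steps appear to follow the pipeline of the cited visual guide:
     tokenize, run DistilBERT once, then train a simple classifier on the
     [CLS] slice. A compact end-to-end sketch (the four-sentence dataset and
     the sklearn classifier are stand-ins for the real setup):

        import torch
        from sklearn.linear_model import LogisticRegression
        from transformers import DistilBertModel, DistilBertTokenizer

        tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
        model = DistilBertModel.from_pretrained("distilbert-base-uncased")

        sentences = ["a masterpiece", "painfully dull",
                     "truly moving", "a waste of time"]
        labels = [1, 0, 1, 0]

        # Step 1: tokenize with padding so the batch forms one tensor.
        batch = tokenizer(sentences, padding=True, return_tensors="pt")

        # Step 2: a single forward pass through DistilBERT.
        with torch.no_grad():
            hidden = model(**batch).last_hidden_state

        # Step 3: keep only the first ([CLS]) position as the sentence
        # embedding and fit a logistic regression on top of it.
        features = hidden[:, 0, :].numpy()
        clf = LogisticRegression().fit(features, labels)
        print(clf.predict(features))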

  16. Proceed to Google Colab

  17. References • Alammar, Jay (2018). The Illustrated BERT, ELMo, and co.
     (How NLP Cracked Transfer Learning) [Blog post]. • Alammar, Jay (2019).
     A Visual Guide to Using BERT for the First Time [Blog post]. •
     McCormick, Chris (2019). BERT Word Embeddings Tutorial [Blog post].