
OpenTalks.AI - Evgenia Elistratova, How to Use BERT with PyTorch - The Basics and an Application to a Practical Task

OpenTalks.AI
February 21, 2020


Transcript

  1. Why BERT embeddings?
     • In the past, the neural word embeddings produced by models such as Word2Vec or fastText were the most powerful available.
     • Under Word2Vec, each word has a fixed representation regardless of its context.
     • BERT produces word representations that are dynamically informed by the surrounding words.
     [source for the picture]
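The deck shows no code for this point, so here is a minimal sketch of the contextual behaviour just described, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (neither is named on the slide); the embedding_of helper is hypothetical.

import torch
from transformers import BertModel, BertTokenizer

# Load a pretrained BERT (assumption: the standard bert-base-uncased checkpoint).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    # Hypothetical helper: return BERT's hidden state for `word` inside `sentence`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # shape: (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# The same word in two contexts yields two different vectors,
# whereas a Word2Vec lookup would return one fixed vector for "bank".
bank_river = embedding_of("he sat on the river bank", "bank")
bank_money = embedding_of("she deposited cash at the bank", "bank")
print(torch.cosine_similarity(bank_river, bank_money, dim=0))  # noticeably below 1.0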
  2. Dataset: SST2
     The dataset we will use in this example is SST2, which contains sentences from movie reviews, each labeled as either positive (value 1) or negative (value 0).
     [source for the picture]
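The slide shows example rows rather than code; as a minimal loading sketch, assuming the Hugging Face datasets library, which distributes SST2 under the GLUE benchmark:

from datasets import load_dataset

# Load SST2 via its GLUE distribution (assumption: the `datasets` library is installed).
sst2 = load_dataset("glue", "sst2")

# Each row pairs a movie-review sentence with its label: 1 = positive, 0 = negative.
example = sst2["train"][0]
print(example["sentence"])
print(example["label"])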
  3. References
     • Alammar, Jay (2018). The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) [Blog post].
     • Alammar, Jay (2019). A Visual Guide to Using BERT for the First Time [Blog post].
     • McCormick, Chris (2019). BERT Word Embeddings Tutorial [Blog post].