OpenTalks.AI - Evgenia Elistratova, How to use BERT with PyTorch: basics and application to a practical task

OpenTalks.AI
February 21, 2020


Transcript

  1. Slide 7: Why BERT embeddings?

     • In the past, neural word embeddings produced by models like Word2Vec or fastText were the most powerful.
     • Under Word2Vec, each word has a fixed representation regardless of the context.
     • BERT produces word representations that are dynamically informed by the words around them.
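The contrast above can be sketched with a toy example; this is not BERT itself, and the vectors and the mixing rule are invented purely to illustrate the difference between a fixed lookup table and a context-sensitive representation:

```python
# Toy illustration (not real Word2Vec or BERT): a static lookup table
# versus a context-sensitive representation. All vectors are made up.

STATIC = {
    "bank":  [0.9, 0.1],
    "river": [0.2, 0.8],
    "money": [0.7, 0.6],
}

def static_vector(word, sentence):
    # A Word2Vec-style embedding ignores the sentence entirely.
    return STATIC[word]

def contextual_vector(word, sentence):
    # A BERT-like model mixes in information from surrounding words;
    # here we fake that by averaging the word's vector with its neighbours'.
    neighbours = [STATIC[w] for w in sentence if w != word and w in STATIC]
    base = STATIC[word]
    if not neighbours:
        return base
    mean = [sum(dim) / len(neighbours) for dim in zip(*neighbours)]
    return [(b + m) / 2 for b, m in zip(base, mean)]

s1 = ["river", "bank"]
s2 = ["money", "bank"]

# The static vector for "bank" is identical in both sentences...
print(static_vector("bank", s1) == static_vector("bank", s2))          # True
# ...while the context-sensitive vector differs between them.
print(contextual_vector("bank", s1) == contextual_vector("bank", s2))  # False
```

The point is only the shape of the behaviour: a fixed table cannot separate "bank" as a riverside from "bank" as an institution, while a contextual model assigns each occurrence its own vector.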
  2. Slide 9: Dataset: SST2

     The dataset we will use in this example is SST2, which contains sentences from movie reviews, each labeled as either positive (value 1) or negative (value 0).
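Loading such a dataset can be sketched as follows; this assumes the tab-separated "sentence&lt;TAB&gt;label" layout used in Alammar's BERT tutorial, and the inline sample stands in for the actual downloaded train.tsv file:

```python
import csv
import io

# Minimal sketch of loading SST2-style data, assuming each line is
# "sentence<TAB>label" with label 1 (positive) or 0 (negative).
# In practice you would open the downloaded train.tsv instead of
# this inline two-row sample.
SAMPLE_TSV = (
    "a stirring , funny and finally transporting re-imagining\t1\n"
    "apparently reassembled from the cutting-room floor\t0\n"
)

def load_sst2(fh):
    """Return parallel lists of sentences and integer labels (1=pos, 0=neg)."""
    sentences, labels = [], []
    for sentence, label in csv.reader(fh, delimiter="\t"):
        sentences.append(sentence)
        labels.append(int(label))
    return sentences, labels

sentences, labels = load_sst2(io.StringIO(SAMPLE_TSV))
print(labels)  # [1, 0]
```

The parallel lists keep sentences and labels aligned by index, which is the shape most tokenizers and classifiers expect downstream.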
  3. Slide 17: References

     • Alammar, Jay (2018). The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) [Blog post].
     • Alammar, Jay (2019). A Visual Guide to Using BERT for the First Time [Blog post].
     • McCormick, Chris (2019). BERT Word Embeddings Tutorial [Blog post].