
BERT for Text Classification with Keras/TensorFlow 2

Galuh Sahid
October 24, 2020


In this workshop, we'll learn how to utilize BERT, a pre-training technique for natural language processing (NLP), to perform our own tasks. We will use BERT in TensorFlow 2 for a text classification task.

Transcript

  1. Transfer learning: a deep learning model is trained on a large dataset, then used to perform similar tasks on another dataset (e.g. text classification). (A minimal fine-tuning sketch follows this transcript.)
  2. “...we train a general-purpose ‘language understanding’ model on a large text corpus (like Wikipedia), and then use that model for downstream NLP tasks that we care about (like question answering)” https://github.com/google-research/bert
  3. “BERT outperforms previous methods because it is the first unsupervised, deeply bidirectional system for pre-training NLP.” https://github.com/google-research/bert
  4. Masked language model: Input: the man went to the [MASK1] . he bought a [MASK2] of milk. Labels: [MASK1] = store; [MASK2] = gallon. (See the masked-LM sketch after this transcript.)
  5. Next sentence prediction: Sentence A: the man went to the store . Sentence B: he bought a gallon of milk . Label: IsNextSentence
  6. Next sentence prediction: Sentence A: the man went to the store . Sentence B: penguins are flightless . Label: NotNextSentence. (See the NSP sketch after this transcript.)
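
A minimal sketch of the fine-tuning workflow from slide 1: load a pre-trained BERT encoder, attach a small classification head, and train end to end with Keras/TensorFlow 2. The TF Hub handles and the toy dataset below are illustrative assumptions, not code from the deck.

    import tensorflow as tf
    import tensorflow_hub as hub
    import tensorflow_text  # noqa: F401 -- registers ops used by the preprocessing model

    # Assumed TF Hub handles for an English uncased BERT-base model.
    PREPROCESS_HANDLE = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
    ENCODER_HANDLE = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

    def build_classifier():
        text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
        preprocessed = hub.KerasLayer(PREPROCESS_HANDLE)(text_input)
        outputs = hub.KerasLayer(ENCODER_HANDLE, trainable=True)(preprocessed)
        # pooled_output summarizes the whole input sequence; use it for classification.
        x = tf.keras.layers.Dropout(0.1)(outputs["pooled_output"])
        logits = tf.keras.layers.Dense(1, name="classifier")(x)
        return tf.keras.Model(text_input, logits)

    model = build_classifier()
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
        loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

    # Toy labeled data, purely for illustration; substitute a real dataset.
    texts = tf.constant(["great movie, loved it", "terrible plot and acting"])
    labels = tf.constant([1, 0])
    model.fit(texts, labels, epochs=1)

Setting trainable=True on the encoder is what makes this transfer learning rather than feature extraction: the pre-trained BERT weights are updated along with the new classification head.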
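
The masked-language-model objective from slide 4 can be demonstrated in a few lines. This sketch assumes the Hugging Face transformers library and the bert-base-uncased checkpoint; it is not code from the deck.

    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = TFAutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    text = "the man went to the [MASK]. he bought a [MASK] of milk."
    inputs = tokenizer(text, return_tensors="tf")
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

    # Print the most likely token at each [MASK] position
    # (the slide's labels are "store" and "gallon").
    mask_positions = tf.where(inputs["input_ids"][0] == tokenizer.mask_token_id)[:, 0]
    for pos in mask_positions.numpy():
        predicted_id = int(tf.argmax(logits[0, pos]))
        print(tokenizer.decode([predicted_id]))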
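
Slides 5 and 6 illustrate next sentence prediction. The sketch below, again assuming the Hugging Face transformers library rather than deck code, scores both sentence pairs with BERT's NSP head (logit index 0 corresponds to "B follows A").

    import tensorflow as tf
    from transformers import AutoTokenizer, TFBertForNextSentencePrediction

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = TFBertForNextSentencePrediction.from_pretrained("bert-base-uncased")

    sentence_a = "the man went to the store."
    for sentence_b in ["he bought a gallon of milk.", "penguins are flightless."]:
        inputs = tokenizer(sentence_a, sentence_b, return_tensors="tf")
        logits = model(**inputs).logits  # shape: (1, 2)
        probs = tf.nn.softmax(logits, axis=-1)[0]
        # Expect a high IsNext probability for the first pair, low for the second.
        print(sentence_b, "-> P(IsNext) =", float(probs[0]))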