BERT for Text Classification with Keras/TensorFlow 2
In this workshop, we'll learn how to use BERT, a pre-training technique for natural language processing (NLP), to perform our own tasks. Specifically, we will use BERT in TensorFlow 2 for a text classification task.
“...we train a general-purpose ‘language understanding’ model on a large text corpus (like Wikipedia), and then use that model for downstream NLP tasks that we care about (like question answering)” (https://github.com/google-research/bert)
“BERT outperforms previous methods because it is the first unsupervised, deeply bidirectional system for pre-training NLP.” (https://github.com/google-research/bert)
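To make this concrete, here is a minimal sketch of what fine-tuning BERT for text classification can look like in TensorFlow 2 with Keras, using publicly available BERT models from TensorFlow Hub. The specific TF Hub model URLs, the dropout rate, and the learning rate below are illustrative assumptions, not choices prescribed by this workshop.

```python
# A minimal sketch: fine-tuning BERT for text classification in TF2/Keras.
# Model URLs and hyperparameters are illustrative assumptions.
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers ops used by the BERT preprocessor

PREPROCESS_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

def build_classifier(num_classes: int = 2) -> tf.keras.Model:
    # Raw strings go in; the preprocessing layer handles tokenization,
    # padding, and the input masks that BERT expects.
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
    preprocessed = hub.KerasLayer(PREPROCESS_URL, name="preprocessing")(text_input)
    # trainable=True fine-tunes BERT's weights along with the new head.
    encoder = hub.KerasLayer(ENCODER_URL, trainable=True, name="bert_encoder")
    outputs = encoder(preprocessed)
    # pooled_output is a fixed-size embedding of the whole input sequence.
    pooled = outputs["pooled_output"]
    dropped = tf.keras.layers.Dropout(0.1)(pooled)
    logits = tf.keras.layers.Dense(num_classes, name="classifier")(dropped)
    return tf.keras.Model(text_input, logits)

model = build_classifier()
model.compile(
    # A small learning rate is typical when fine-tuning a pre-trained encoder.
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```

One design note: because the preprocessing layer lives inside the model graph, the model accepts raw strings directly (e.g. `model(tf.constant(["this movie was great"]))` after training), which keeps tokenization consistent between training and serving.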