BERT - Engineers SG presentation
Contents
What did I learn from the Engineers SG talk on BERT?
- Old way of building stuff (see the GloVe sketch after this list)
  - Build a model on top of GloVe embeddings, then train it from scratch
  - Needs lots of labelled data
- New way of building stuff (see the fine-tuning sketch after this list)
  - Use pretrained BERT
  - Fine-tune on your own unlabelled data (continue the language-model pretraining)
  - Then train on labelled task data
  - Less labelled data required
  - Expect better results
- Use pretrained BERT
  - BERT was released in October 2018
  - Pretrained models: English, plus a multilingual model covering 102 languages
- BERT
  - Take a model pretrained on a huge corpus
  - Do additional training (fine-tuning) on your labelled data
  - It learns the actual task from only a few examples
- Innovations
  - Subword tokenization with WordPiece (see the tokenizer sketch below)
  - Transformer architecture
  - Language-model pretraining tasks: masked LM and next-sentence prediction (see the fill-mask sketch below)
  - Fine-tuning
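
A minimal sketch of the "old way" described above: average frozen GloVe vectors and train a classifier on top. The GloVe file path, the toy examples, and the choice of scikit-learn are illustrative assumptions, not details from the talk.

```python
# "Old way": frozen GloVe word vectors + a classifier trained from scratch.
import numpy as np
from sklearn.linear_model import LogisticRegression

def load_glove(path):
    """Parse a GloVe .txt file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

def embed(text, vectors, dim=100):
    """Average the GloVe vectors of the words in text (zeros if none are known)."""
    hits = [vectors[w] for w in text.lower().split() if w in vectors]
    return np.mean(hits, axis=0) if hits else np.zeros(dim, dtype=np.float32)

glove = load_glove("glove.6B.100d.txt")        # hypothetical local GloVe file
texts = ["great talk, learned a lot", "waste of an evening"]  # toy labelled examples
labels = [1, 0]

X = np.stack([embed(t, glove) for t in texts])
clf = LogisticRegression().fit(X, labels)      # in practice this needs lots of labelled data
print(clf.predict(X))
```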
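A minimal sketch of the "new way", assuming the Hugging Face transformers library and the bert-base-multilingual-cased checkpoint (neither was prescribed in the talk): load pretrained BERT and fine-tune it on a small labelled dataset. The optional language-model fine-tuning on unlabelled domain text would happen before this step, with a masked-LM head instead of the classification head.

```python
# "New way": start from pretrained BERT and fine-tune on a small labelled set.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "bert-base-multilingual-cased"    # assumed checkpoint; multilingual coverage
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

texts = ["great talk, learned a lot", "waste of an evening"]  # toy labelled data
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):                             # a handful of epochs is typical for fine-tuning
    outputs = model(**batch, labels=labels)    # classification head on top of pretrained BERT
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
print(float(outputs.loss))
```

Note how little task-specific code and labelled data this needs compared with the GloVe pipeline above.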
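To make the subword-tokenization innovation concrete, the snippet below (again assuming transformers and the bert-base-uncased checkpoint) shows WordPiece splitting rare words into "##"-prefixed pieces instead of mapping them to an unknown token.

```python
from transformers import AutoTokenizer

# WordPiece keeps a fixed-size vocabulary; rare or long words come back as
# '##'-prefixed subword pieces rather than a single unknown token.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
print(tokenizer.tokenize("BERT handles uncommonly long neologisms gracefully"))
```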
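And a glimpse of the masked language-model pretraining task: BERT predicts a hidden word from the context on both sides. The fill-mask pipeline and model name are again illustrative assumptions.

```python
from transformers import pipeline

# Masked-LM task: predict the [MASK] token from both left and right context.
fill = pipeline("fill-mask", model="bert-base-uncased")         # assumed checkpoint
for prediction in fill("The presentation was about [MASK] learning."):
    print(prediction["token_str"], round(prediction["score"], 3))
```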