
BERT Masked Language Modelling: Should I pre-train on domain-specific text?

Duration: 20:19 · Views: 6.9K · Likes: 306 · Date Created: Nov 2020

Channel: tanmay bakshi

Category: Science & Technology

Description: In this video, I answer a question about BERT: should I be pre-training a second time, with domain-specific text? Usually, BERT is fine-tuned directly on a downstream task. However, the amount of labelled data you can get for that task is sometimes limited. In that case, you may want to consider pre-training BERT a second time, in an unsupervised fashion, on unlabelled text from the same domain as your labelled data. Code: github.com/tanmayb123/BertPreTraining
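As a rough illustration of what this second pre-training step can look like, here is a minimal sketch using the Hugging Face transformers and datasets libraries. This is an assumption on my part, not the code from the linked repository; the corpus file name and all hyperparameters are placeholders.

# Continued masked-language-model pre-training of BERT on in-domain text.
# Sketch only: "domain_corpus.txt" and the hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Unlabelled in-domain text, one passage per line (hypothetical file).
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_set = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# The collator randomly masks 15% of input tokens; the model is trained
# to predict the originals, exactly as in BERT's MLM objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="bert-domain-adapted",
    per_device_train_batch_size=16,
    num_train_epochs=1,
    learning_rate=5e-5,
)

Trainer(
    model=model, args=args, train_dataset=train_set, data_collator=collator
).train()

# The checkpoint saved in "bert-domain-adapted" can then be fine-tuned on
# the downstream labelled task in place of the generic bert-base weights.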
