Masked LM

Concept

Masked Language Model (MLM), the pre-training objective used in BERT: a fraction of input tokens (15% in BERT) is selected for masking, and the model is trained to predict the original tokens from the surrounding context. Of the selected tokens, 80% are replaced with a [MASK] token, 10% with a random token, and 10% are left unchanged.
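A minimal sketch of BERT-style token masking, using plain Python over word tokens rather than a real subword tokenizer; the 15% selection rate and the 80/10/10 split follow the BERT paper, while the toy vocabulary and function name are illustrative:

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "dog", "sat", "ran", "on", "mat"]  # toy vocabulary

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Select ~mask_prob of positions; of those, 80% -> [MASK],
    10% -> random vocabulary token, 10% -> left unchanged.
    Returns (corrupted, labels): labels[i] holds the original token
    at selected positions (the prediction target) and None elsewhere."""
    rng = random.Random(seed)
    corrupted, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK          # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted[i] = rng.choice(VOCAB)  # 10%: random token
            # else: 10%: keep the original token
    return corrupted, labels

sentence = "the cat sat on the mat".split()
corrupted, labels = mask_tokens(sentence, mask_prob=0.5, seed=1)
```

During pre-training, the loss is computed only at positions where a label is set; unselected positions contribute nothing to the objective.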