LLaMA 2 paper
Study / Research
Mentioned in 2 videos
Mentioned to show how pervasively tokens appear in model descriptions, and to motivate decisions about tokenizer training and coverage (e.g., tokenizers trained on large corpora).