BART
AI Model
BART (Bidirectional and Auto-Regressive Transformers) is a denoising autoencoder for pretraining sequence-to-sequence models, cited in these videos as an example of the pre-training objectives explored before auto-regressive language modeling became dominant.
Mentioned in 3 videos
Videos Mentioning BART

Beating GPT-4 with Open Source Models - with Michael Royzen of Phind
Latent Space
A denoising autoencoder for pretraining sequence-to-sequence models, used in an early Hugging Face demo for long-form question answering and later fine-tuned by Michael Royzen.
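
For context, here is a minimal sketch of the kind of sequence-to-sequence generation such a demo builds on, using the Hugging Face transformers library. The checkpoint, question, and decoding settings are illustrative assumptions; the original demo paired a fine-tuned BART with a retrieval step not shown here.

```python
# Illustrative sketch: loading BART for seq2seq generation with Hugging Face
# transformers. An off-the-shelf checkpoint stands in for the fine-tuned model.
from transformers import BartForConditionalGeneration, BartTokenizer

model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")

question = "Why is the sky blue?"  # hypothetical example input
inputs = tokenizer(question, return_tensors="pt")

# Generate an output sequence with beam search and print the decoded text.
output_ids = model.generate(inputs.input_ids, num_beams=4, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```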

Anthropic Head of Pretraining on Scaling Laws, Compute, and the Future of AI
Y Combinator
Cited as an example of the pre-training objectives the field explored before auto-regressive language modeling became dominant.
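
For context, a minimal sketch of that denoising objective: the model is trained to reconstruct original text from a corrupted version in which a span has been replaced by a mask token. The checkpoint and sentences below are illustrative assumptions, not anything specific from the episode.

```python
# Illustrative sketch of BART's denoising (text-infilling) objective.
from transformers import BartForConditionalGeneration, BartTokenizer

model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")

original = "The chef cooked a delicious dinner for the guests."   # target text
corrupted = "The chef cooked a <mask> for the guests."            # masked span

input_ids = tokenizer(corrupted, return_tensors="pt").input_ids
labels = tokenizer(original, return_tensors="pt").input_ids

# Cross-entropy between the decoder's reconstruction and the original text;
# this is the quantity minimized during denoising pre-training.
loss = model(input_ids=input_ids, labels=labels).loss
print(loss.item())
```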

Building A $2 Billion SaaS Company: Lessons From A Two Time Founder
Y Combinator