MapReduce
Software / App
A programming model developed by Google to process large data sets in parallel, created out of necessity when their original indexing pipeline could no longer keep pace with the web's rapid growth.
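The model splits work into a map phase that emits key-value pairs, a shuffle that groups values by key, and a reduce phase that aggregates each group. A minimal single-process word-count sketch (illustrative only; real MapReduce distributes these phases across many machines):

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group all emitted values by key.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["the web grew fast", "the web"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts == {"the": 2, "web": 2, "grew": 1, "fast": 1}
```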
Mentioned in 3 videos
Videos Mentioning MapReduce

Things That Don't Scale, The Software Edition – Dalton Caldwell and Michael Seibel
Y Combinator

E156: Ivy League antisemitism, macro, SaaS recovery, Gemini, Figma deal delay + big Friedberg update
All-In Podcast
A programming model and algorithm for processing large data sets, part of Jeff Dean's notable technical contributions.

Work at a Startup Expo 2019
Y Combinator
A programming model used by HealthSherpa for scalable backend data products, indicating that they handle large datasets.