How Does TikTok Actually Work? (It’s Scary…)
Key Moments
TikTok's algorithm is a complex "two-tower" system, not an "editor." It blindly optimizes for engagement, potentially exploiting dark human impulses. US control won't fix its fundamental nature.
Key Insights
TikTok's "algorithm" is actually a sophisticated "two-tower" recommender system architecture, not a simple editorial process.
This system uses machine learning to create embeddings (lists of numbers) for both users and videos, matching them based on learned patterns.
The system's "intelligence" comes from its ability to process vast amounts of user feedback (watch time, likes, swipes) in near real time, a feedback loop that short-form video is ideally suited to generate.
Recommendation algorithms are inherently value-agnostic, optimizing for engagement by approximating underlying systems, which can include undesirable human impulses.
Changing the ownership of TikTok to US control (e.g., Oracle) is unlikely to fundamentally fix the system's core issues, as the algorithm itself lacks human values.
Human curation, despite potential biases, is preferable to algorithmic curation for news and information, as it incorporates human values and guardrails.
THE ALGORITHM DEBATE: FROM MYTH TO REALITY
The recent news surrounding potential US control of TikTok's operations centers on its powerful recommendation algorithm. Many perceive this algorithm as a digital editor, shaping content for users. However, this understanding is flawed. The discussion highlights a common misconception where the term "algorithm" is imbued with almost mystical power. Cal Newport, drawing on his computer science background, aims to demystify how these systems actually function, moving beyond the popular narrative to reveal the underlying technical architecture. The goal is to understand the system's true capabilities and limitations, especially in relation to its societal impact.
THE TECHNOLOGY BEHIND THE FEED: A "TWO-TOWER" SYSTEM
At its core, TikTok's recommendation engine is not a single algorithm but a complex distributed system, more accurately termed a "recommender system architecture." This system likely employs a "two-tower" model. One tower processes incoming videos, generating a "property list" (an embedding vector) representing each video's characteristics. The second tower processes user profiles, particularly on-platform behavior, to produce a similar vector representing each user's interests. Both towers use machine learning, including neural networks such as transformers, to map videos and users into a common vector space where similarity can be measured directly.
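The two-tower idea can be sketched in a few lines. This is an illustrative toy, not TikTok's actual code: random vectors stand in for the outputs of the two neural-network towers, and candidates are ranked by dot-product similarity in the shared embedding space.

```python
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 8

# Stand-ins for the towers' outputs: real systems compute these with
# deep networks over video features and user behavior, respectively.
video_embeddings = rng.normal(size=(100, EMBED_DIM))  # "video tower" outputs
user_embedding = rng.normal(size=EMBED_DIM)           # "user tower" output

def top_k(user_vec, video_vecs, k=5):
    """Indices of the k videos whose embeddings best align with the user's."""
    scores = video_vecs @ user_vec  # one dot product per candidate video
    return np.argsort(scores)[::-1][:k]

recommended = top_k(user_embedding, video_embeddings)
```

Because both towers target the same vector space, retrieval reduces to a nearest-neighbor search, which is what makes ranking millions of candidates feasible at scale.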
MACHINE LEARNING AND REAL-TIME FEEDBACK LOOPS
The "training" of these towers is largely semi-supervised, driven by vast amounts of user interaction data. The system learns by observing which videos users watch, like, or skip. The goal is to align the user's interest vector closely with the property vectors of videos that user engages with. TikTok's success stems partly from the short-form video format, which generates rapid feedback: users can cycle through dozens of videos per session, providing a rich, real-time data stream that lets the system continuously update user profiles, quickly overcoming the "cold start" problem and producing remarkably accurate recommendations even for new users.
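A minimal sketch of this feedback loop, under assumed mechanics (the update rule and learning rate are hypothetical, not TikTok's): each interaction pulls the user's interest vector toward a video they engaged with and pushes it away from one they swiped past.

```python
import numpy as np

def update_user_vector(user_vec, video_vec, engaged, lr=0.1):
    """One toy online-learning step: move toward videos the user watched
    through, away from videos they skipped. `lr` is an assumed rate."""
    direction = 1.0 if engaged else -1.0
    return user_vec + direction * lr * (video_vec - user_vec)

# After a positive interaction, the user vector moves closer to the video:
user = np.array([1.0, 0.0, 0.0])
video = np.array([0.0, 1.0, 0.0])
updated = update_user_vector(user, video, engaged=True)
```

Run dozens of such steps per session, as short-form viewing allows, and the profile converges fast; that speed, more than any single clever model, is the format's advantage.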
BEYOND PURE USER PREFERENCE: POPULARITY AND DIVERSITY
While user-specific interests are primary, TikTok's system also incorporates elements of trending or globally popular content. This "short-term profile" complements the "long-term profile" (user description). By mixing highly personalized recommendations with broadly popular or trending videos, the system introduces novelty and discovery. This strategy allows users to encounter content that might not precisely match their past behavior but is currently resonating with a wider audience. This blend contributes to the addictive and surprising nature of the feed, preventing it from becoming too predictable.
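The mixing described above can be sketched as a weighted score. The weight `alpha` and the score ranges are assumptions for illustration; real systems tune this trade-off empirically.

```python
def blended_score(personal, popularity, alpha=0.8):
    """Mix a user-specific match score with a global trending signal.
    High alpha favors the long-term personal profile; low alpha lets
    broadly popular videos surface. (Hypothetical weighting scheme.)"""
    return alpha * personal + (1 - alpha) * popularity

# With a high alpha, the strong personal match still wins...
niche = blended_score(personal=0.9, popularity=0.1)   # 0.74
viral = blended_score(personal=0.6, popularity=0.95)  # 0.67

# ...but lowering alpha flips the ranking toward the trending video.
viral_low = blended_score(personal=0.6, popularity=0.95, alpha=0.4)  # 0.81
niche_low = blended_score(personal=0.9, popularity=0.1, alpha=0.4)   # 0.42
```

This one dial is how a feed balances predictability against the novelty and "discovery" that keep it surprising.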
THE ETHICAL DUALITY: BLIND OPTIMIZATION VERSUS HUMAN VALUES
Crucially, machine learning recommendation systems like TikTok's are designed to optimize for engagement metrics, not to embody human values. They are essentially building mathematical approximations of underlying patterns in the data. This process is inherently value-agnostic. The system doesn't understand concepts like truth, ethics, or harm; it merely seeks to maximize positive feedback signals. As a result, it can inadvertently amplify negative human impulses like hatred, violence, or sensationalism, much like a "psychopathic newspaper editor" that lacks empathy and humanistic guardrails, leading to potentially harmful content proliferating.
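The value-agnosticism is visible in what the objective function does and does not contain. A hypothetical ranking step (field names are assumptions) makes the point: nothing in the code distinguishes an outrage clip from a cooking tutorial.

```python
def rank_feed(candidates):
    """Order candidates purely by a predicted engagement score.
    The objective has no notion of truth, harm, or civic value;
    content competes on engagement alone. (Illustrative sketch.)"""
    return sorted(candidates,
                  key=lambda c: c["predicted_engagement"],
                  reverse=True)

feed = rank_feed([
    {"topic": "local news",   "predicted_engagement": 0.4},
    {"topic": "outrage bait", "predicted_engagement": 0.9},
])
```

Any guardrail has to be bolted on around this loop; the optimizer itself will happily amplify whatever maximizes the signal.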
RETHINKING CONTROL AND CURATION: HUMAN VS. ALGORITHMIC
The idea of US control over TikTok's algorithm, while addressing potential foreign influence, does not solve the fundamental problem. The algorithmic architecture itself is the issue. It operates blindly, optimizing for engagement without ethical consideration. This contrasts sharply with human curation, which, despite potential biases, incorporates normative standards and values developed over centuries. For news and information consumption, human-curated sources or those with human oversight are preferable, allowing for context, critical evaluation, and a consideration of civic values that algorithms inherently lack. The focus should shift from who controls the algorithm to the inherent limitations of algorithmic curation itself.
Common Questions
How does TikTok's recommendation system work? TikTok uses a "two-tower" system: one tower processes videos to create property lists (embeddings) and the other processes user data to create interest profiles. The two are then matched to recommend videos.