The Worst Epidemic (Episode #213)
Key Moments
Child sexual abuse imagery is a global epidemic fueled by tech failures and a lack of accountability, demanding urgent solutions.
Key Insights
The term 'child pornography' is misleading; 'child sexual abuse material' (CSAM) accurately describes the commission of crimes against children.
The scale of CSAM has exploded exponentially with the internet and smartphones, vastly overwhelming law enforcement and tech company efforts.
Tech companies face a privacy versus protection dilemma, with encryption posing a significant hurdle to detecting and preventing CSAM.
Government inaction and underfunding, despite legislative efforts, have hampered effective responses to the CSAM crisis.
The creation and dissemination of CSAM are facilitated by technological infrastructure, with a growing market for such content.
Education and parental involvement are crucial as immediate coping mechanisms, given the current limitations of technological and governmental solutions.
TERMINOLOGY AND THE GRAVITY OF CSAM
The conversation begins by clarifying terminology, distinguishing the misleading term 'child pornography' from the more accurate 'child sexual abuse material' (CSAM). CSAM records the direct commission of sexual crimes against children, often involving rape and torture, and is consumed by a vast market of individuals who find such acts sexually gratifying, a fact that underscores the profound depravity of both producers and consumers.
THE STAGGERING SCALE OF THE PROBLEM
The scale of CSAM has grown exponentially, with reports increasing from under 100,000 in 2007 to nearly 17 million in 2019. This surge is directly linked to the advent of smartphones and widespread internet access, which have made the creation and dissemination of imagery easier than ever. The sheer volume of reports, often comprising millions of files, demonstrates the overwhelming nature of the problem, far surpassing the capacity of law enforcement and support organizations.
TECH COMPANIES' ROLE AND CHALLENGES
Tech companies are central to the CSAM crisis, as their platforms facilitate its spread. While some companies, like Facebook, are actively scanning and reporting CSAM, often leading to high report numbers, others contribute minimally. The discussion highlights the tension between a company's commitment to privacy and its responsibility to protect children. Encryption, particularly in services like Facebook Messenger, poses a significant obstacle, potentially rendering CSAM undetectable.
GOVERNMENTAL FAILURES AND LEGISLATIVE INERTIA
Despite legislative attempts like the 2008 Protect Our Children Act, governmental response has been largely ineffectual. Key provisions, such as adequate funding for law enforcement and mandated reporting on the issue, have gone unfulfilled. Chronic underfunding of Internet Crimes Against Children (ICAC) task forces leaves them triaging cases, often prioritizing those involving infants and toddlers. The lack of consistent and transparent data collection further exacerbates the problem.
TECHNOLOGICAL SOLUTIONS AND THEIR LIMITATIONS
While technologies like Microsoft's PhotoDNA aim to detect CSAM by matching images, they are primarily trained on previously identified material, missing new content. The development of video detection technology lags behind that for images, and proprietary systems create fragmentation, hindering a unified response. Even with advanced tools, the sheer volume of content and the deliberate efforts of offenders to evade detection present significant challenges.
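PhotoDNA's exact algorithm is proprietary, but the "fuzzy matching" idea it relies on can be illustrated with a much simpler perceptual hash. The sketch below is a generic illustration, not PhotoDNA itself, and the tiny 8x8 "images" are hypothetical toy data: it computes a 64-bit "average hash" and compares images by Hamming distance, so near-duplicates match even when they are not byte-identical.

```python
# Illustrative sketch only: PhotoDNA is proprietary, but it belongs to the same
# family as simple perceptual hashes like the "average hash" shown here.
# The toy 8x8 grayscale grids below are hypothetical example data.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid.

    Each bit is 1 if the corresponding pixel is brighter than the image's
    mean. Small edits (brightness shifts, mild noise) flip few bits, so
    near-duplicates stay within a small Hamming distance of each other.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(h1, h2):
    """Count the bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")

# A toy "image": left half dark, right half bright.
original = [[10] * 4 + [200] * 4 for _ in range(8)]
# A slightly brightened copy: a byte-exact comparison would miss this match.
edited = [[p + 5 for p in row] for row in original]
# An unrelated pattern: top half bright, bottom half dark.
different = [[200] * 8 for _ in range(4)] + [[10] * 8 for _ in range(4)]

print(hamming_distance(average_hash(original), average_hash(edited)))     # small
print(hamming_distance(average_hash(original), average_hash(different)))  # large
```

This also shows why ordinary cryptographic hashes are unsuitable for the task the episode describes: a one-pixel edit changes a cryptographic hash completely, whereas a perceptual hash degrades gradually. It equally shows the limitation raised above: a hash-matching system can only flag images close to ones already in a known database, never genuinely new material.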
THE ENCRYPTION DEBATE AND PRIVACY IMPLICATIONS
The push for end-to-end encryption, though it strengthens user privacy, presents a stark dilemma. Privacy is a fundamental right, but the inability to scan encrypted communications could render CSAM undetectable, particularly on platforms like Facebook Messenger, a major conduit for such material. Critics argue that prioritizing absolute privacy over child protection is a moral failing, especially when platforms facilitate adult-child interactions and potential exploitation through mechanisms like sextortion.
THE CULTURE OF CONSUMPTION AND PERPETRATORS
The discussion delves into the culture surrounding CSAM consumption, with a significant portion of the population reportedly seeking out such material. This includes not only those with pedophilic attractions but also 'extremists,' 'addicts,' and others driven by curiosity or a desire for increasingly extreme stimuli. Perpetrators often radicalize each other online, normalizing their behavior and developing elaborate methods to evade detection, highlighting the deep-seated nature of the problem.
SURVIVORS AND THE ROLE OF EDUCATION
Survivors of CSAM face immense ongoing trauma, often receiving constant notifications as their images are rediscovered. The conversation emphasizes that while technological and governmental solutions are urgently needed, immediate efforts must focus on education. Parents are urged to educate their children about online dangers, responsible image sharing, and the importance of feeling safe to disclose mistakes without fear of reprisal, as current technological solutions are insufficient.
LIVESTREAMING AND THE FUTURE OF CSAM
The emergence of livestreaming CSAM, as seen in a disturbing case involving Zoom, presents new challenges because it leaves no digital record. Offenders are increasingly turning to such undetectable formats. The development of classifiers for new content and effective methods to prevent the weaponization of shared images are critical areas for future technological advancement. This highlights the ongoing cat-and-mouse game between those creating CSAM and those trying to detect it.
THE PROSPECT OF FICTIONAL CSAM AND PEDOPHILIA
The controversial idea of creating fictional CSAM (e.g., animated or CGI) to satisfy demand without actual victimization is explored. While currently illegal, the debate centers on whether this could reduce demand for real CSAM or merely normalize interest in the subject. The discussion also touches on the nature of pedophilia itself, acknowledging it as a genuine sexual orientation but unequivocally condemning any act of abuse, regardless of consent claims or the means of production.
Common Questions
What is the correct terminology for this imagery, and why does it matter?
The proper terminology is 'child sexual abuse material' (CSAM). It is important because 'child pornography' misleadingly suggests consent or adult content, when what is being depicted is always a record of a crime against a child, often involving rape and torture. (Timestamp: 459)
Mentioned in this video
The Guardian: A news organization where Gabriel Dance previously worked as interactive editor and was part of the group that won a Pulitzer Prize for coverage of NSA surveillance.
The Marshall Project: A criminal justice news site that Gabriel Dance helped launch, focusing on the death penalty, prison, and policing issues.
The National Security Agency, subject of reporting by The Guardian on widespread secret surveillance, for which Gabriel Dance and colleagues won a Pulitzer Prize.
A non-profit organization that is a leader in pushing for transparency from tech companies regarding their efforts to combat child sexual abuse material.
The Department of Justice: The federal agency responsible for producing bi-annual reports on child sexual abuse material, but only two out of seven expected reports have been produced.
The New York Times: The newspaper where Gabriel Dance works as Deputy Investigations Editor, investigating technology and online sexual abuse, and where his reporting on this topic has been published.
The Federal Bureau of Investigation, to whom The New York Times journalists are legally obligated to report instances of child sexual abuse material. The FBI, like state and local police, is overwhelmed.
Congress: The legislative body that allocated funds for combating child sexual abuse material through the Protect Our Children Act, but only half of the appropriated funds were typically deployed.
The National Center for Missing & Exploited Children (NCMEC): A non-profit organization that runs the cyber tip line for child sexual abuse material reports, serving as a clearinghouse for this imagery. NCMEC received 17 million reports in 2019, but its technology is 20 years old and it is overwhelmed.
Thorn: A non-profit that is developing software to help smaller companies scan and detect child sexual abuse material, as building such systems can be expensive.
Alex Stamos: Former Chief Security Officer for Facebook and Yahoo, who stated that if other companies reported CSAM as aggressively as Facebook, the numbers would be significantly higher (50-100 million).
William Barr: Former Attorney General who held an event at the Department of Justice to argue that encryption is enabling child sexual abuse and that a backdoor into encryption is needed.
Sam Harris: Host of the Making Sense podcast, introducing the difficult topic of child sexual abuse material and interviewing Gabriel Dance.
Head legal counsel at The New York Times, who advised Gabriel Dance and Michael Keller on the lack of journalistic privilege regarding child pornography and the legal obligation to report it.
Lindsey Graham: Senator who introduced the Earn It Act, a bill that would affect Section 230 protections for companies related to child pornography.
Gabriel Dance: Deputy Investigations Editor at The New York Times, working with a team investigating technology, including online sexual abuse imagery and surveillance capitalism. He was previously the interactive editor for The Guardian, where he won a Pulitzer Prize.
Mark Zuckerberg: CEO of Facebook, who announced plans to fully encrypt Facebook Messenger, a move that would significantly hinder the detection of child sexual abuse material on the platform.
A Detective Constable from the Toronto Police Department (Canada) who, while undercover, recorded a live stream of child abuse on Zoom, leading to arrests.
Michael Keller: A colleague of Gabriel Dance at The New York Times, with whom he primarily investigated child sexual abuse material and who also has a computer science background.
Edward Snowden: Former NSA contractor known for leaking classified information, who weighed in on The New York Times' series, finding one story particularly credulous to law enforcement's arguments against encryption.
Ron Wyden: Senator who introduced a bill seeking five billion dollars in funding for law enforcement and others on the front lines against child sexual abuse. This funding would be over 10 years.
A 20-year-old man in Pennsylvania who was arrested for raping a 6-year-old boy on a Zoom live stream, observed by law enforcement, and received a sentence of up to 90 years.
Joe Biden: Senator who, along with Debbie Wasserman Schultz, introduced the 2008 Protect Our Children Act, a bipartisan bill aimed at confronting child sexual abuse material.
Hany Farid: A researcher then at Dartmouth (now at Berkeley) who partnered with Microsoft to invent PhotoDNA in 2009. He believes PhotoDNA is not incredibly complex.
A federal prosecutor who remarked that offenders use live streams because they are harder to detect and leave no record, calling Zoom the 'Netflix of child pornography'.
Richard Blumenthal: Senator who introduced the Earn It Act alongside Lindsey Graham.
Debbie Wasserman Schultz: Representative who, along with Joe Biden, co-introduced the 2008 Protect Our Children Act.
Microsoft: A tech company that invented PhotoDNA and licenses it to most other companies. Microsoft's Bing search engine was found to be serving up child sexual abuse imagery by NYT journalists, despite the company owning the detection technology.
Dropbox: A cloud storage company that facilitates the spread of CSAM. Dropbox and Google Drive initially only scanned files when shared, but Dropbox started scanning videos after NYT reporting, finding numerous files.
A company that provides in-kind donations to NCMEC to upgrade their systems for detecting child sexual abuse material.
Facebook: A major tech company that facilitates the spread of child sexual abuse material, with a moral responsibility to address the problem. Facebook has been an industry leader in finding and reporting CSAM.
Netflix: A streaming service, used in an analogy by a federal prosecutor who described Zoom as the 'Netflix of child pornography' due to its use for live streaming abuse.
WhatsApp: An encrypted messaging service owned by Facebook, suggested as an alternative platform for encrypted communications so that Facebook Messenger could remain unencrypted for child safety purposes. WhatsApp reports significantly fewer CSAM instances.
Google: A major tech company that sits on NCMEC's board and has contributed to upgrading their systems. Google's search engine did not return CSAM in the journalists' tests.
Polaroid: A camera that allowed for instant photo development, described as the 'biggest boon to child sexual abuse imagery' before the internet, because it enabled self-production of abusive material without third-party developers.
The iPhone: The smartphone whose introduction, alongside other high-quality camera phones and broadband, led to an explosion in child sexual abuse material reports.
Facebook Messenger: An instant messaging service by Facebook, responsible for a significant majority (65-72%) of all child sexual abuse material reports made to NCMEC. Planned encryption of the platform is expected to sharply decrease detection.
PhotoDNA: A proprietary fuzzy image matching technology invented in 2009 by Microsoft and Dr. Hany Farid, used by tech companies to detect known child sexual abuse images.
Zoom: A video conferencing tool mentioned as a platform facilitating the spread of CSAM, particularly through live streams of abuse. Prosecutors have referred to it as the 'Netflix of child pornography'.
Section 230: Legal protections for online platforms that shield them from liability for content posted by users. The Earn It Act proposes removing these protections for child pornography.
The Protect Our Children Act: A bipartisan bill passed unanimously in 2008 (re-authorized in 2012) aimed at confronting child sexual abuse material. Many of its major provisions, such as funding allocation and bi-annual reports, were not fulfilled by the federal government.
The Earn It Act: A bill introduced by Lindsey Graham and Richard Blumenthal in Congress. If passed, it would remove Section 230 protections for companies concerning child pornography, potentially increasing their liability.