Algorithmic Addiction Harms Children: Regulation Needed
TL;DR
- The pervasive use of social media algorithms, designed for continuous partial reinforcement, fosters individual and societal media addiction, diminishing human relationships and collective well-being.
- Tech companies' deliberate design choices to maximize engagement, even when aware of potential harm, necessitate regulatory intervention akin to historical efforts against tobacco and drunk driving.
- The insidious nature of algorithmic content amplification, particularly in AI-generated or user-shared media, poses a greater risk than traditional media by exposing children to harmful content without parental oversight.
- The erosion of literacy and critical thinking skills due to constant digital stimulation and short-form content adoption, exacerbated by AI, threatens the foundation of informed decision-making and societal progress.
- State-level legislative efforts to regulate AI and social media design are crucial due to federal inaction, facing significant resistance from tech companies employing deceptive lobbying tactics.
- The comparison of social media to mass media publishing highlights the need for accountability; advocates argue that mass-mediated, advertiser-supported content is publishing rather than individual speech, and so falls outside the First Amendment protections that tech companies invoke.
- Children's development of essential social and emotional skills is compromised by screen time, leading to deficits in eye contact, verbal communication, and real-world interaction.
Deep Dive
The movement to protect children from Big Tech, spearheaded by organizations like Mothers Against Media Addiction (MAMA), argues that current technological platforms pose a significant public health threat, akin to the dangers once posed by tobacco and drunk driving. This necessitates a comprehensive societal response involving parent education, school policy changes, and robust regulatory safeguards, as individual parental efforts alone are insufficient to counter the addictive design and pervasive harms of these technologies.
The core of the issue lies in the intentionally addictive algorithms employed by tech companies, which are designed to maximize engagement rather than child well-being. The second-order implications are profound: children are exposed to developmentally inappropriate content, including self-harm and pornographic material, often in private and without adult guidance, normalizing traumatic experiences and hindering crucial social and emotional development. The constant influx of information and the pursuit of dopamine hits through continuous partial reinforcement erode attention spans, diminish critical thinking skills, and displace the real-world interactions essential to childhood development.
This displacement is evident in observable deficits in social skills, such as eye contact and verbal communication, which become significant problems as children age. Furthermore, the rapid adoption of AI tools, while offering potential benefits, risks further eroding literacy and critical thinking: children may rely on AI for answers rather than developing their own problem-solving abilities, leading to a societal decline in intellectual depth and an over-reliance on machines.
The fight against these harms is framed as a public health battle, drawing parallels to the successful campaigns against drunk driving and smoking. Just as society evolved to recognize and regulate the dangers of alcohol and tobacco, MAMA advocates for treating media addiction as a serious issue requiring systemic solutions. This involves legislative action, such as New York's SAFE (Stop Addictive Feeds Exploitation) for Kids Act, which prohibits addictive algorithmic feeds and disruptive overnight notifications for users under 18, and the California AI bill, which promotes a duty of care for social media platforms. While outright bans on social media for children, as seen in Australia, are a strong measure, the immediate focus in the U.S. is on implementing safety regulations and demanding transparency from tech companies. The movement emphasizes that these regulations are not about censorship but about responsible design and preventing the amplification of harmful content, distinguishing mass media publishing from protected free speech. The ultimate goal is to foster a "human-first future" where technology serves as a tool rather than a dominant force, preserving essential human experiences and relationships.
Action Items
- Audit social media platform design: Identify and flag 3-5 algorithmic amplification patterns that contribute to addictive usage or exposure to harmful content (ref: slot machine analogy).
- Create parent education materials: Develop 3-5 concise guides explaining the risks of specific social media features (e.g., infinite scroll, notification timing) and their impact on child development.
- Implement school phone policy: Draft a proposal for a school-wide ban on phones during instructional hours to foster in-person interaction and focus on learning.
- Track state legislative wins: Document 3-5 key policy changes enacted at the state level (e.g., New York's SAFE for Kids Act) to inform future advocacy efforts.
Key Quotes
"One mother texted me and said, 'I don't know what to do. I feel like my son is addicted to cocaine that I gave him.' And that is the helplessness that parents feel. But it's not something that we can solve on an individual level, right? We as parents have to feed our kids healthy meals, but we don't expect every parent to keep the food supply safe. It's not your job to go test the baby formula at the drugstore, right? We have a system in place to make it safe. And that's why, you know, these products, because all tech is is products, they need to conform to certain standards, that they're safe."
Julie Scelfo highlights the overwhelming helplessness parents experience when their children face severe issues like addiction, drawing an analogy to food safety. Scelfo argues that just as parents rely on systems to ensure food is safe, they should expect technology products to meet safety standards for children, implying a need for external regulation rather than individual parental control.
"I have no problem with companies making profit. I have, you know, no problem with innovation, we need innovation. But to do it on the backs of kids is just gross, and that's when I decided I needed to do something about this."
Julie Scelfo expresses her motivation for activism, stating that while she supports companies making profits and innovating, she finds it "gross" when this is achieved at the expense of children's well-being. Scelfo's decision to act stemmed from this ethical conflict, indicating a belief that profit should not supersede the safety and health of young users.
"You know, when you have FOMO all the time, and you're spending every second wondering about your Snap streak and how many likes you have, you're not really concentrating on reading and math, and unfortunately we're seeing that in the nation's report card."
Julie Scelfo points out the detrimental effect of constant social media engagement on children's academic focus. Scelfo explains that the pressure of FOMO (fear of missing out) and the pursuit of social media metrics distract from essential learning activities like reading and math, a decline reflected in the Nation's Report Card.
"So calling it media addiction was a way of drawing attention to both the individual addiction that we have, from algorithms that are designed like slot machines to give you continuous partial reinforcement and get the dopamine going so that you have to come back, but also as a way to begin to talk about our collective society and what kind of society we want. Do we want a world where we let the machines and the apps and the screens, you know, dominate everything? Or do we want to preserve what I think is so special and unique about being human, and that is our human relationships?"
Julie Scelfo explains her choice to frame the issue as "media addiction," emphasizing its dual nature. Scelfo argues that this term highlights both the individual compulsive use driven by addictive algorithms and the broader societal question of whether technology should dominate human experience and relationships.
"The arguments that they use are disingenuous, but the big one is free speech. They keep using the free speech argument."
Julie Scelfo identifies "free speech" as a primary, though disingenuous, argument used by tech companies to resist regulation. Scelfo suggests that this argument is employed to deflect from the need for safety measures on social media platforms, implying that the companies' interpretation of free speech is self-serving.
"The invention of literacy, in my opinion, is one of the greatest achievements of human civilization, and I think we often forget that the state of mass literacy hasn't been around that long. It's only been a little more than a hundred years that we've had 80% of the population able to read at above a fifth-grade level, and that's something I would like to continue into the future. And if we are not prioritizing a culture of literacy... It was the culture of literacy that gave us written laws. It was the culture of literacy that allowed science and medicine and all of these wonderful things. And what's been happening is we've been eroding that. We've been eroding that with our visual culture, we've been eroding that with short-form content, we've been eroding it with the diminishment of our attention spans. We know the adult attention span is now less than a goldfish's, it's eight and a half seconds. And what that means is people are reading less. They're not taking the time to think through complex ideas that require reflection and depth. And if we're not equipping our students to be able to do that, that really scares me for the future."
Julie Scelfo expresses deep concern about the erosion of literacy and critical thinking skills due to the current media environment. Scelfo posits that the rise of visual culture, short-form content, and diminished attention spans are undermining the foundations of civilization built on literacy, posing a significant threat to future societal progress.
Resources
External Resources
Books
- "Amusing Ourselves to Death" by Neil Postman - Referenced as an example of cultural diminishment due to spectacle and soundbites, taken to an extreme by current media.
Articles & Papers
- "The Nightmare Videos of Children's YouTube" (TED Talk) by James Bridle - Discussed as an example of how disturbing content can be reached through algorithms, starting from seemingly innocuous videos.
People
- Neil Postman - Author whose work on cultural diminishment is referenced.
- James Bridle - Creator of a TED Talk about the dangers of children's YouTube content.
- Tipper Gore - Mentioned in relation to past concerns about media content and its effects.
- Joe Camel - Big Tobacco's cartoon advertising mascot, referenced as an example of marketing tactics used to make smoking acceptable.
- Donald Trump - Mentioned for issuing an executive order regarding states' ability to regulate AI.
- Josh Hawley - Mentioned as an example of a conservative Republican supporting tech regulation.
- Mike Johnson - Mentioned for refusing to bring the Kids Online Safety Act to a vote in the House.
- Steve Scalise - Mentioned for stating that the Kids Online Safety Act could stifle speech.
- Rob Bonta - California Attorney General sued by Meta in an effort to stop implementation of the Age Appropriate Design Code.
- David Sacks - Mentioned as someone who might sue the White House regarding AI regulation.
Organizations & Institutions
- Mothers Against Media Addiction (MAMA) - Organization founded by Julie Scelfo to protect children from harmful technologies and media addiction.
- The New York Times - Former employer of Julie Scelfo.
- Meta - Company acknowledged for taking action on suicide and self-harm content, and for suing California to stop the Age Appropriate Design Code.
- Australia - Country that implemented a social media ban for kids under 16.
- The Trevor Project - Organization that runs chat groups for youth.
- The Audre Lorde Project - Organization that runs chat groups for youth.
- The European Union (EU) - Mentioned in relation to GDPR.
- Crooked Media - Production company for the podcast.
Podcasts & Audio
- Offline with Jon Favreau - Podcast where the discussion takes place.
Other Resources
- AI (Artificial Intelligence) - Discussed in relation to its impact on children, homework, and societal behavior.
- Social Media - Discussed as a source of harm to children, addiction, and a public health threat.
- Media Addiction - Term used to describe the societal consumption of media and screens.
- Age Appropriate Design Code - Legislation requiring social media platforms to operate with a duty of care to children.
- SAFE for Kids Act (Stop Addictive Feeds Exploitation Act) - Legislation passed in New York regulating addictive algorithmic feeds and overnight notifications from social media companies for users under 18.
- Digital Choice Act - Legislation passed in Utah giving users control over their data and the right to delete.
- Kids Online Safety Act - Legislation that passed the Senate with broad bipartisan support but was not brought to a vote in the House.
- GDPR (General Data Protection Regulation) - Data privacy regulation in Europe.
- Myopia - Nearsightedness, discussed as a physical health issue linked to excessive screen time.