Meta CEO Mark Zuckerberg Denies Instagram Targets Children at Landmark Youth Social Media Trial

  • Writer:  Editorial Team
  • Feb 19
  • 4 min read
In a high-stakes civil trial unfolding in Los Angeles Superior Court, Meta Platforms CEO Mark Zuckerberg took the witness stand this week to vigorously deny that Instagram was engineered to attract or addict children — despite testimony and documents presented by plaintiffs alleging just that. The case, one of the first of its kind to reach a jury, has drawn intense public and legal scrutiny over the social media giant’s influence on youth mental health and its internal policies on underage use.

At the heart of the litigation is a lawsuit filed by a California woman identified as KGM, who says she began using Instagram and YouTube around age nine and later developed serious mental health issues, including depression and suicidal thoughts, as a result of compulsive use. She alleges that Meta and Alphabet's Google designed these services in ways that hooked young users and worsened their psychological well-being.

During intense questioning by plaintiff attorney Mark Lanier, Zuckerberg maintained that Meta never set out to make Instagram addictive to children or to deliberately maximize the time young people spent on the platform. He reiterated that Instagram’s official rules prohibit children under 13 from creating accounts, but acknowledged the practical challenge of enforcing such age limits, noting that “a meaningful number of people lie about their age to use our services.”

Internal Documents Under Scrutiny

A major flashpoint in the trial has been internal Meta communications brought before jurors. Attorneys for the plaintiffs have shown emails and presentations suggesting that executives, including Zuckerberg, discussed ways to grow Instagram’s user base among younger demographics. For instance, a 2018 Instagram internal presentation featured language such as “If we want to win big with teens, we must bring them in as tweens,” implying a strategic interest in engaging pre-teen users. Meta’s legal team has disputed the interpretation of such materials, with Zuckerberg saying that critics are “mischaracterizing” what was meant by internal discussions.

The jury also saw a 2022 internal document listing milestones for increasing daily user engagement time on Instagram — from an average of 40 minutes in 2023 to a projected 46 minutes by 2026. Though plaintiffs’ lawyers argue these are explicit goals showing a focus on maximizing screen time, Zuckerberg countered that these figures served as “gut checks” for company leadership and were not intended as concrete targets. He said that the company’s priorities had shifted toward improving user experience rather than simply increasing time spent on the app.

Age Verification, Safety Tools, and Responsibility Debates

Throughout his testimony, Zuckerberg stressed the difficulty of verifying users' ages online. He suggested that responsibility for accurate age verification may lie with mobile device makers, not just platform operators, highlighting the broader systemic challenge of policing youth access to digital services. Meta has also pointed to its investments in safety tools and policies intended to protect young people online, including discussions about creating a separate version of Instagram for children under 13 — an idea the company ultimately abandoned.

Plaintiffs’ attorneys argue that Meta’s efforts — even if substantial — fall short of addressing the real harm they claim these platforms have caused. Many families, school districts, and other plaintiffs have joined the consolidated case, which now includes more than 1,600 individual claims seeking accountability for youth mental health harms allegedly linked to social media use.

Industry Context and Broader Backlash

The trial is seen as a bellwether for a large wave of litigation targeting big tech companies over alleged harm to youth. Similar lawsuits have been filed against other major platforms, including Snap, TikTok, and Google's YouTube, though some companies settled before trial. In this Los Angeles trial, YouTube was expected to be represented by its CEO, Neal Mohan, but attorneys for the plaintiff ultimately chose not to call him as a witness.

Legal experts say the outcome of this case could have far-reaching implications. Unlike many tech liability suits that hinge on user-generated content (and are shielded by Section 230 protections), this lawsuit focuses on platform design choices and product features. A ruling against Meta could erode longstanding legal defenses enjoyed by tech companies, potentially opening the door to a new era of accountability for how digital products are built and marketed.

Meta’s Defense and Public Statements

Meta has consistently denied the central allegations, both in court and through spokesperson statements. The company argues that multiple factors — including preexisting mental health conditions, offline stressors, and “problematic use” of apps rather than genuine clinical addiction — better explain the plaintiffs’ experiences. Representatives for YouTube have also denied the specific youth addiction claims.

Zuckerberg’s testimony underscores the tension between corporate narratives about innovation and safety and the growing public and legal concern over social media’s impact on youth mental health. As the trial continues, jurors will have to weigh internal evidence against Meta’s defenses and broader arguments about responsibility and harm in the digital age.

