How YouTube Can Harm Kids and Teens — Algorithms, Predators, Mental Health, and What Families Can Do

YouTube is the most widely used platform among U.S. teens, and many report being on it “almost constantly,” according to Pew Research Center findings covered by the Associated Press.

YouTube can be educational and inspiring—but it can also expose minors to harms such as:

  • Harmful recommendation loops (including self-harm and eating disorder content)
  • Bullying and harassment (comments, creators, peer content)
  • Grooming and exploitation risks (especially in comments and live contexts)
  • Sleep disruption and overuse

YouTube has implemented youth safety measures such as YouTube Kids, supervised experiences, child safety policies, and “take a break” defaults for teen users. Even with these measures, some kids and families have still experienced serious harm.

When YouTube-related harm is severe, especially involving exploitation, coercion, or catastrophic mental health outcomes, parents may need a legal evaluation in addition to reporting and clinical support.

We seek justice for the harms done to children and families by addictive and harmful social media apps.

Why YouTube is uniquely powerful

YouTube is not just a social app. It’s:

  • A search engine for video
  • A recommendation machine
  • A constant stream of short-form content (Shorts), very similar to TikTok
  • A creator economy where teens form parasocial bonds
  • A community with comments, live chat, and, in some contexts, direct messages, all of which can affect mental health and physical safety

YouTube remains the most popular platform among U.S. teens. At that scale, and with that level of immersion, it can be both beneficial and extremely harmful.

The most common harms to youth on YouTube

Harmful content recommendations

Recommendation systems can push a teen from an innocuous interest into harmful repetition: body comparison, disordered eating themes, suicide “means” content, hate content, violence, and more.

In 2025, health experts and researchers reported that YouTube’s recommendation engine can trap teens in loops of “suboptimal content” while burying positive material. Even when teens search for age-appropriate terms (e.g., Minecraft, memes), they may encounter high-risk recommendations.

YouTube has publicly acknowledged concerns about repetitive exposure and has tried, with only limited success, to reduce some categories of recommendations for teens.

Eating disorder and self-harm content concerns

The Center for Countering Digital Hate published research alleging YouTube’s algorithm pushes eating disorder and self-harm content to teens, and major media covered those findings.

Recommended content can include harmful trends like “Anorexia Boot Camp” and “Thinspo”. Trends like “What I Eat in a Day” (#WIEIAD) may showcase extreme calorie deficits or “safe foods,” which can trigger disordered eating. Some studies suggest that a share of recommended videos contain self-harm or suicide-related content, and that a notable percentage of girls report seeing such content monthly. Creators may also use coded language or misspelled hashtags to bypass content moderation.

Daily social media use is associated with negative feelings about body image among teens. Social media content has been linked to adolescents engaging in disordered eating behaviors. Reports suggest a rise in eating disorder hospitalization rates for teen girls, with social media mentioned in patient histories. Exposure to this type of content is also linked to increased risks of depression, anxiety, and suicidal thoughts.

Harassment and bullying

Cyberbullying on the platform can lead to serious mental health issues such as social anxiety, depression, and suicidal thoughts.

Harassment can happen through:

  • Public Comment Sections: Public comments are high-risk areas where teens may face repeated insults, slurs, or hostile interactions. These sections are often open and may not be immediately reviewed before a minor sees them unless the video is explicitly marked “made for kids”.
  • Targeted Abuse and Shaming: Risks include videos uploaded with the specific intent to shame, deceive, or insult a minor. This can involve mocking a teen’s physical traits, academic performance, or social status.
  • Doxxing and Privacy Violations: Malicious users may share or threaten to share a teen’s private personally identifiable information (PII), such as home addresses, school names, or phone numbers, to invite further harassment.
  • Brigading and Raiding: Teens may be targets of “brigading,” where someone rallies others to direct coordinated abuse at them, or “raiding,” where malicious abuse is funneled through live stream chat.
  • Harmful Content Manipulation: Emerging risks in 2025 include the use of advanced AI tools to manipulate imagery for harassment, such as placing a teen’s face over a victim in a violent scene.

Grooming and exploitation risk

Reported grooming and exploitation risks for teens on YouTube include financially motivated sexual extortion using self-generated images or AI-generated deepfakes; information gathering from innocent pictures that reveal personal details; and dependency-building through gifts that can escalate into control and sextortion.

Predators on YouTube often leverage the platform’s public nature to identify and target vulnerable minors.

  • Targeting Child Influencers: Minors who post personal content like tutorials or “morning rituals” are high-risk targets. Predators may use public details about their lives to build a false sense of familiarity.
  • Sexualized Commenting: Comment sections on videos featuring teens are frequently used by predators to leave sexually explicit remarks or to encourage minors to move to private messaging apps like Discord or Snapchat.
  • “Love Bombing” and Incentives: Offenders may use excessive compliments and “love bombing” to build trust. They often offer digital rewards—such as gift cards, game credits, or upgraded app subscriptions—to entice teens into further interaction.
  • Isolation and Secrecy: A core grooming tactic is making the teen feel the relationship is a “secret” and isolating them from their family and real-world friends. 

Emerging Exploitation Risks in 2025

  • Financial Sextortion: This is one of the fastest-growing crimes in 2025. Offenders, often posing as peers, coerce teens (frequently boys) into sending explicit images and then immediately demand money under threat of public exposure.
  • Generative AI and Deepfakes: Reports in late 2025 indicate a massive spike in “AI-enabled harm”. Predators now use generative AI to create explicit “deepfake” images of minors using faces from their public social media or school postings to blackmail them.
  • Sadistic Exploitation Groups: The FBI warned in late 2025 about violent online groups that target vulnerable kids to push them toward self-harm or “sadistic” acts, often recording these incidents for a child exploitation “enterprise”.

Algorithms and repetitive exposure: why “one video” can become a problem

A single video about fitness or dieting might be harmless. The risk is the loop—what a teen sees next, and next, and next.

YouTube has described safeguards designed to reduce repetitive exposure for teens in certain categories.

Experiments in 2025 showed that scrolling while logged out can lead to recommendations for R-rated movie clips, fighting compilations, and instructions for making homemade weapons within one hour. Creators often use misspelled hashtags (e.g., using “0” for “O”) to bypass search blocks on harmful terms like “thinspiration,” allowing this content to remain in “Up Next” panels. 

The “autoplay” feature is known to encourage binge-watching habits that children and teens struggle to break, often leading to sleep deprivation. Continued exposure to “asphyxiation games” or dangerous dares can normalize these acts for adolescents.

From a parent’s perspective, the takeaway is:

  • It’s not just “what did my teen watch once?”
  • It’s “what is the system feeding them over time?”

The “rabbit hole” problem

Some families experience a terrifying pattern:

  1. Their teen watches normal content
  2. The YouTube algorithm introduces increasingly extreme material
  3. Their teen becomes fixated, ashamed, or isolated
  4. Their mental health worsens; self-harm risk increases

The Surgeon General’s advisory on social media and youth mental health emphasizes a need to reduce risk of harm and calls for multifaceted action.

Not all youth are affected equally. But for the teens who are vulnerable, these loops can matter.

Live streaming

Live content adds risk because it is real time and interactive.

Livestreaming introduces unique risks for teens by removing the safety net of editing and pre-publication review, creating an environment where impulsive behavior and harmful interactions can occur in real time.

Immediate and Unfiltered Risks

  • Impulsive Behaviors: Without the delay of editing, teens may act on impulse, engaging in dangerous challenges or sharing private information they might otherwise keep to themselves.
  • Real-Time Cyberbullying: Live comment sections allow for immediate, unedited feedback. Negative comments about appearance or hateful speech can instantly impact a teen’s mental health and self-image while they are still on camera.
  • Predatory Solicitation: Malicious actors use livestreams to target vulnerable minors, sometimes coercing them into performing harmful or sexual acts in real time.
  • Irreversible Mistakes: Once something is broadcast live, it cannot be taken back; viewers can record and reshare the stream even if the original is deleted. 

In 2025, Lifewire reported that YouTube raised the minimum age to livestream alone to 16 and began requiring that an adult be present on camera when 13-to-15-year-olds appear in streams.

YouTube’s child safety policy page states that the platform doesn’t allow content that endangers minors and provides reporting guidance, but technological safeguards have not fully enforced those policies in practice.


What should I do if my child was harmed by YouTube?

Prioritize Your Child’s Safety and Mental Health

Your child’s immediate safety and emotional stability should come first. If your child is experiencing distress, anxiety, depression, self‑harm ideation, or behavioral changes, seek professional mental health support as soon as possible. In severe cases, emergency care or hospitalization may be necessary. Early intervention not only supports recovery, but also helps document the seriousness of the harm your child experienced.

Preserve Evidence of the Harm

Before harmful content is removed or accounts are deleted, preserve as much evidence as possible. This may include:

  • Video URLs and titles
  • Channel names and usernames
  • Screenshots of comments, messages, or livestream chats
  • Watch history, recommendations, and notifications
  • Dates, times, and frequency of exposure

This evidence can be critical for investigators, mental health professionals, and attorneys evaluating whether the harm was foreseeable and preventable.

Report Harmful Content to YouTube

YouTube provides tools to report videos, channels, comments, and user behavior that violate its policies. While reporting does not always prevent harm, it creates a documented record showing that dangerous or exploitative content was present on the platform. Parents should retain confirmation emails or screenshots showing reports were submitted.

Report Exploitation or Enticement to Authorities

If the harm involved sexual exploitation, grooming, coercion, or sextortion, report it beyond the platform:

  • File a report with the National Center for Missing & Exploited Children (NCMEC) through its CyberTipline
  • Contact local law enforcement, or the FBI for online exploitation

These reports help protect other children and establish formal records of criminal conduct tied to online platforms.

Consult an Attorney When Harm Is Severe

When a child suffers serious injury and there are questions about platform responsibility, consulting legal counsel can help families understand their rights. An experienced attorney can assess whether YouTube’s design, algorithms, or safety failures played a role in exposing a child to foreseeable harm.

When YouTube Harm May Justify a Legal Evaluation

Not every harmful experience leads to a lawsuit, but a legal evaluation may be appropriate when injuries are severe or systemic failures are involved.

A legal evaluation may be warranted when:

  • Exploitation, coercion, or sextortion occurred
    This includes grooming, sexual exploitation, or threats tied to images, videos, or livestreams facilitated by the platform.
  • A child was targeted and harmed through foreseeable platform features
    Examples may include algorithmic recommendations, autoplay, livestream monetization, direct messaging, or inadequate age‑based safeguards.
  • Severe injury occurred
    This may include hospitalization, suicide attempts, self‑harm, eating disorders, or long‑term psychiatric injury linked to online exposure.
  • A family is navigating wrongful death
    In the most tragic cases, online harms may contribute to a child’s death, raising serious questions about platform accountability and safety obligations.

How Pritzker Hageman Can Help

Pritzker Hageman represents families nationwide in cases involving catastrophic injury and wrongful death, including harm connected to digital platforms and corporate misconduct. We understand the complex intersection of technology, child safety, and accountability.

  • Parents may bring legal claims on behalf of their injured children
  • Your consultation is free and confidential

If your child was seriously harmed and you are seeking answers, speaking with an experienced legal team can help you understand your options and next steps.

FAQs: what parents search about YouTube and kids

Is YouTube safe for kids?

It depends on age, supervision, and content controls. Common Sense Media rates YouTube as 13+ and notes “iffy stuff abounds.”
YouTube Kids is designed as a separate, simpler experience with parental controls.

Does YouTube recommend harmful content?

Despite steps YouTube has taken to limit recommendations in certain categories for teens, research and reporting indicate that harmful recommendation patterns (including eating disorder and self-harm content) continue to appear.

What are the warning signs that my teen is being harmed by YouTube?

Signs can include social withdrawal, sleep deprivation from late-night scrolling, increased anxiety or anger when asked to stop using the platform, and talking negatively about their body or appearance.

What new protections did YouTube add for teens in 2025?

Starting in late 2025, YouTube began using AI-driven age estimation to automatically enable protections for users under 18, regardless of the birthdate entered on the account. These protections include disabling personalized ads and limiting repeat recommendations of “body-focused” content.

Can parents sue YouTube for a teen’s mental health harm?

Yes. As of December 2025, thousands of individual lawsuits and over 1,000 cases consolidated into Multidistrict Litigation (MDL No. 3047) are pending against YouTube (Alphabet) and other platforms. These lawsuits typically allege that the platforms’ addictive design directly contributed to eating disorders and body dysmorphia, self-harm or suicidal ideation, and severe depression or anxiety requiring medical treatment.

1-888-377-8900 (Toll-Free) | attorneys@pritzkerlaw.com

We are not paid unless you win. Contacting us does not create an attorney-client relationship.
Awards & Recognition:

The Pritzker Hageman law firm and our attorneys have been recognized in:

U.S. News & World Report

Pritzker Hageman has been recognized as one of the best law firms for personal injury litigation by U.S. News & World Report every year the award has been given since 2012.

Super Lawyers®, Thomson Reuters

Attorneys at Pritzker Hageman have been awarded the peer selected Super Lawyers distinction every year since 2004.

America’s Top 100 Attorneys®

Pritzker Hageman attorneys have received Lifetime Achievement selection to America’s Top 100 Attorneys®.

Three Time Attorneys of the Year

Pritzker Hageman lawyers have been named Attorneys Of The Year by Minnesota Lawyer three times.