A child operates a smartphone displaying the YouTube app.

YouTube is implementing new safeguards that could help prevent teen users from being sent down rabbit holes of potentially harmful content.


The platform plans to limit repeat recommendations of videos with certain content, including those that idealize certain body weights, James Beser, director of product management for YouTube Kids and Youth, said in a blog post Thursday. The change was developed with YouTube's advisory board of outside youth health experts, Beser said, adding that certain types of content "may be innocuous as a single video, but may be problematic for some teens if viewed repeatedly."


The change to YouTube's recommendation system for teen users comes as part of a broader update to the platform's youth safety efforts, which also includes "take a break" notices and more prominent information about crisis resources.


Social media platforms are facing increasing scrutiny over their effects on the mental health of users, particularly young people. In 2021, lawmakers pressed Instagram and YouTube over the promotion of accounts featuring content about young people losing weight and dieting. Earlier this year, YouTube updated its policies on eating disorder content, banning certain types of videos and restricting others to adult users only.


In recent years, YouTube has also updated how it handles misinformation about medical issues like vaccines and abortion.


YouTube says it won't repeatedly recommend videos in the following categories to teen users: "Content that compares physical features, idealizes certain types over others, idealizes specific fitness levels or body weights, or shows social aggression in the form of non-contact fights or threats," Beser said.


"The increased frequency of idealizing unhealthy norms and behaviors can emphasize potentially problematic messages — and those messages can affect how some teens see themselves," said Allison Briscoe-Smith, a clinician and researcher. said Allison Briscoe-Smith, a member of YouTube's Youth and Family Advisory Committee. Statement. "Guardrails can help teens maintain healthy stereotypes by naturally comparing them to others and sizing up how they want to appear in the world."


As with many social media policies, the challenge is often not introducing new rules but enforcing them. YouTube said the recommendation restrictions will take effect in the United States on Wednesday, with more countries to be added next year.


YouTube's "take a break" and "bedtime" reminders, introduced in 2018 and already enabled by default for teen users, will now appear as "full-screen prompts" on both YouTube Shorts and long-form videos. Parents can update their frequency, but reminders will appear every hour by default for teen users.


The platform is also making crisis resource panels, which include contact information for suicide crisis services, take up the full screen when users "search for topics related to suicide, self-harm and eating disorders," Beser said. The panels will be displayed to users of all ages and will also include suggestions for positive search terms such as "self-compassion" and "core exercises."


YouTube said it is also issuing guidelines for parents and teens on how to safely create content online.