YouTube to stop repeatedly recommending certain types of content to teen users
By Clare Duffy, CNN
New York (CNN) — YouTube is implementing new safeguards that could help prevent the platform from sending teen users down potentially harmful content rabbit holes.
The platform plans to limit repeated content recommendations for videos on certain topics, including content that idealizes certain body weights, James Beser, director of product management for YouTube Kids and Youth, said in a blog post Thursday. The change came out of work with YouTube’s advisory board of third-party youth wellness experts and the realization that certain categories of content “may be innocuous as a single video, but could be problematic for some teens if viewed in repetition,” Beser said.
The change to YouTube’s recommendation system for teen users comes as part of a wider update to the platform’s youth safety efforts, which also includes making “take a break” reminders and information about crisis resources more prominent.
Social media platforms have faced increased scrutiny for their effects on the mental health of users, especially young people. In 2021, lawmakers called out Instagram and YouTube for promoting accounts featuring content depicting extreme weight loss and dieting to young users. Earlier this year, YouTube rolled out changes to its policies on eating disorder content, prohibiting certain types of videos on the topic and restricting others so they can be viewed only by adult users.
YouTube in recent years has also updated how it handles misinformation about medical issues such as vaccines and abortion.
YouTube says that videos in the following categories will not be repeatedly recommended to teen users: “content that compares physical features and idealizes some types over others, idealizes specific fitness levels or body weights, or displays social aggression in the form of non-contact fights and intimidation,” Beser said.
“A higher frequency of content that idealizes unhealthy standards or behaviors can emphasize potentially problematic messages — and those messages can impact how some teens see themselves,” Allison Briscoe-Smith, a clinician, researcher and member of YouTube’s Youth and Families Advisory Committee, said in a statement. “Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and size up how they want to show up in the world.”
As with many social media policies, the challenge often lies not in introducing new rules but in enforcing them. YouTube said the recommendation limits will take effect in the United States on Wednesday, with additional countries to follow next year.
YouTube’s “take a break” and “bedtime” reminders, which were introduced in 2018 and are already on by default for teen users, will now appear as “full-screen takeovers” on both YouTube Shorts and long-form videos. The reminders will be set to pop up every hour by default for teen users, although parents can adjust how often they appear.
The platform is also making its crisis resource panels — which include, for example, suicide lifeline contact information — full screen when users search for topics “related to suicide, self-harm, and eating disorders,” Beser said. The resource panels will be shown to users of all ages, and will also include suggestions for more positive search terms, such as “self-compassion” and “grounding exercises.”
YouTube said it is also rolling out guidelines for parents and teens about how to safely create content online.