YouTube will limit the number of videos about fitness and weight that teens see.
The platform will ensure its algorithms don't repeatedly recommend similar content to younger users, even when that content complies with its rules.
YouTube has decided to cease recommending videos to teenagers that glorify certain fitness levels, body weights, or physical attributes, following warnings from experts about the potential harm of such content when viewed repeatedly.
While 13- to 17-year-olds will still have access to these videos, the platform’s algorithms will no longer lead young users into related content “rabbit holes” after they watch one.
YouTube acknowledged that this type of content does not violate its guidelines, but it recognized that frequent exposure could negatively impact the well-being of some viewers.
Dr. Garth Graham, YouTube’s global head of health, stated, “As teenagers begin to form their identities and set personal standards, repeated exposure to content that promotes idealized norms can create unrealistic expectations, potentially leading to negative self-perceptions.”
Experts from YouTube’s youth and families advisory committee indicated that while certain video categories may seem harmless individually, they could become problematic with repeated viewing.
The newly implemented guidelines, which are now in effect in the UK and globally, target content that idealizes specific physical traits, such as beauty routines aimed at altering one’s appearance; promotes certain fitness levels or body weights, like exercise regimens that advocate for a particular physique; or fosters social aggression, including physical intimidation.
YouTube will refrain from making repeated recommendations on these subjects to logged-in teenagers who have verified their age. This safety initiative has already been rolled out in the United States.
Allison Briscoe-Smith, a clinician and YouTube advisor, remarked, “An increased exposure to content that glorifies unhealthy standards or behaviors can reinforce harmful messages, which may influence how some teenagers perceive themselves. Implementing ‘guardrails’ can assist teens in developing healthy habits as they naturally compare themselves to others and consider their own identities.”
The recently enacted Online Safety Act in the UK mandates that technology companies take measures to shield children from harmful content and evaluate how their algorithms might expose individuals under 18 to detrimental material. The legislation highlights the potential risks posed by algorithms that can inundate children with excessive content in a brief period, requiring companies to analyze any threats these algorithms may present to minors.
Sonia Livingstone, a social psychology professor at the London School of Economics, emphasized that a recent report from the Children’s Society charity highlights the necessity of addressing the effects of social media on self-esteem. According to the Good Childhood report, nearly 25% of girls in the UK expressed dissatisfaction with their appearance.
“There is at least an acknowledgment that modifying algorithms is a constructive step that platforms like YouTube can undertake,” Livingstone remarked. “This change will be especially advantageous for young individuals facing vulnerabilities and mental health challenges.”