YouTube has announced that it will limit certain health and fitness video recommendations for teenagers, especially those that promote specific body types as “ideal.” While teenagers aged 13 to 17 can still search for and watch fitness-related content, YouTube will no longer encourage them to keep watching similar videos repeatedly. The platform is making this change because of concerns that repeated exposure to such videos can lead young people to develop negative feelings about themselves.

According to the BBC, YouTube’s algorithm usually suggests videos related to what a user has just watched or shows similar videos in a sidebar. However, for teenagers, this will no longer happen with content that compares physical features, promotes certain body weights or fitness levels as superior, or shows social aggression like intimidation or non-contact fights. These changes were made after YouTube’s Youth and Families Advisory Committee found that teens are more vulnerable than adults to forming harmful beliefs about themselves when repeatedly exposed to messages about “ideal” standards online.

However, these restrictions will only work if teenagers are logged into YouTube accounts with accurate birthdates, as the platform currently has no way to verify users’ ages.

Experts such as Dr. Petya Eckler from the University of Strathclyde, who studies the impact of social media on body image, welcomed the move. She told the BBC that there is a clear connection between young people’s use of social media and how they perceive their bodies.

However, Dr. Eckler emphasized that this should be part of a broader conversation within families about health, fitness, and the idea that exercise should enhance overall well-being, not just focus on appearance.

The BBC also reported that YouTube is introducing new features to help parents monitor their children’s activity on the platform. Parents will be able to link their accounts with their teens’ accounts, allowing them to see what videos their children upload, their subscriptions, and their comments. They will also receive email notifications when their teens upload videos or start livestreams.

Earlier this year, UK media regulator Ofcom urged tech companies to adjust their algorithms to protect children from what it called “toxic” content.

This move by YouTube aligns with those concerns, aiming to create a safer environment for young users.

Credit: BBC

https://www.bbc.com/news/articles/c049kn7wlgxo
