Meta plans to launch a new safety tool intended to stop minors from receiving and sending nude images, even in private chats on Instagram and Facebook; the feature may also be offered as an option for adults. It responds to concerns that making chats private could allow child abuse to go undetected. Meta says the aim is to protect users, particularly women and teens, from unwanted explicit images and from pressure to send them. The company is also changing defaults so that minors cannot receive messages from strangers.

Critics worry that making chats private will leave Meta unable to detect and stop child abuse. Some have argued for client-side scanning, which inspects content on a user's device before it is sent. Meta says the new tool will not work that way, because it considers client-side scanning an unacceptable erosion of privacy. Instead, the tool will use machine learning running entirely on the device to detect nudity. Meta also argues that trying to identify child-abuse material this way is unreliable and could wrongly implicate innocent people.
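To make the on-device idea concrete, here is a minimal sketch of what local nudity detection could look like. It is purely illustrative and not Meta's actual system: the TorchScript model file, the two-class (safe/nudity) output, and the blur threshold are all assumptions.

```python
# Illustrative sketch only: a generic on-device image classifier standing in
# for the kind of local nudity-detection model the article describes.
# This is NOT Meta's implementation; the model file, label order, and
# threshold below are hypothetical.
import torch
from torchvision import transforms
from PIL import Image

# Hypothetical: assume a small binary classifier (safe vs. nudity) was trained
# elsewhere and exported for on-device use as TorchScript.
model = torch.jit.load("nudity_classifier.pt")  # hypothetical model file
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def should_blur(image_path: str, threshold: float = 0.8) -> bool:
    """Return True if the local model thinks the image likely contains nudity.

    Everything happens on the device; the image is never uploaded, which is
    the contrast the article draws with client-side scanning schemes that
    report content to a third party.
    """
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)          # shape: [1, 3, 224, 224]
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)  # [1, 2]: [safe, nudity]
    return probs[0, 1].item() >= threshold
```

In a design like this, only a yes/no decision (blur or warn) is made locally, and nothing is sent off the device, which is why it avoids the privacy objections raised against client-side scanning.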

Meta is adding further safety features as well, such as allowing teens to receive messages only from people they already know and giving parents control over certain settings. The announcement matters because it shows how tech companies are trying to balance privacy and safety, especially for children.

(Based on information from BBC)

https://www.bbc.com/news/technology-68093343
