Meta said the aim is to help shield people from nude images and other unsolicited messages. As a further protection, the company said it can neither view the images itself nor share them with third parties. "We're working closely with experts to ensure these new features preserve people's privacy, while giving them control over the messages they receive," a spokesperson said. Meta plans to share more details in the coming weeks, ahead of any testing.
The new feature is akin to the "Hidden Words" tool launched last year, Meta added. That feature lets users filter abusive messages in DM requests based on keywords. If a request contains any filter word you've chosen, it's automatically placed in a hidden folder that you can choose never to open — though it's not completely deleted.
The feature is welcome but long overdue: unwanted nude photos were largely ignored by social media companies for years and have become a pervasive problem. One 2020 study by University College London found that of 150 young people aged 12 to 18, 75.8 percent had been sent unsolicited nude images.
Sending unwanted nude photos, also known as "cyberflashing," has been targeted by multiple jurisdictions, including California and the UK. In the UK, it could become a criminal offense if the Online Safety Bill is passed by parliament. California didn't go quite that far, but last month the state assembly and senate voted unanimously to allow users to sue over unsolicited nude photos and other sexually graphic material.