Unsolicited nude photos are a massive problem on social media, but Instagram is reportedly working on a tool that could help. An early screengrab tweeted by researcher Alessandro Paluzzi indicates that "Nudity protection" technology "covers photos that may contain nudity in chat," giving users the option to view them or not. Instagram parent Meta confirmed to The Verge that it's in development. 

Meta said the aim is to help shield people from nude images and other unsolicited messages. As further protection, the company said it can neither view the images itself nor share them with third parties. "We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive," a spokesperson said. Meta plans to share more details in the coming weeks, ahead of any testing.
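Meta hasn't said how the detection works, but since the company says it can't view the images, the implication is that classification happens on the user's device. Purely as a hedged sketch, the names, the stub classifier and the 0.8 threshold below are all assumptions rather than anything Meta has described; a client-side gate might look roughly like this:

```python
from dataclasses import dataclass

@dataclass
class ChatImage:
    image_id: str
    pixels: bytes           # raw image data; in this model it never leaves the device
    revealed: bool = False  # set once the user taps "view anyway"

def nudity_score(pixels: bytes) -> float:
    """Stand-in for an on-device classifier returning P(nudity).

    A real implementation would run an ML model over the pixels;
    this stub flags everything just to demonstrate the flow.
    """
    return 0.9

def should_cover(img: ChatImage, threshold: float = 0.8) -> bool:
    """Cover the image in chat unless the user has chosen to reveal it."""
    return not img.revealed and nudity_score(img.pixels) >= threshold

if __name__ == "__main__":
    msg = ChatImage("abc123", pixels=b"\x00")
    print("blurred" if should_cover(msg) else "shown")  # blurred
    msg.revealed = True  # the user opts in to viewing
    print("blurred" if should_cover(msg) else "shown")  # shown
```

The key design point, consistent with Meta's privacy claim, is that both the score and the reveal decision stay local: the server only ever needs to know that an image was delivered, not what it contained.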

The new feature is akin to the "Hidden Words" tool launched last year, Meta added. That feature lets you filter abusive messages in DM requests based on keywords you choose. If a request contains any filter word you've chosen, it's automatically placed in a hidden folder that you can choose never to open, though it isn't completely deleted.
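To give a sense of the mechanics, the core of a Hidden Words-style filter is a keyword match that reroutes a DM request rather than deleting it. The following is a minimal sketch under assumed matching rules (whole-word, case-insensitive); Instagram's actual logic isn't public:

```python
def route_request(message: str, hidden_words: set[str]) -> str:
    """Return the folder a DM request lands in: 'requests' or 'hidden'.

    Assumed rule: a request is hidden if any filtered word appears as a
    whole word, ignoring case and surrounding punctuation.
    """
    words = {w.strip('.,!?"').lower() for w in message.split()}
    return "hidden" if words & hidden_words else "requests"

filters = {"spam", "scam"}
print(route_request("Limited-time offer just for you", filters))  # requests
print(route_request("This is not a SCAM, promise", filters))      # hidden
```

Note that hiding rather than deleting matches the behavior described above: the request still exists in the hidden folder, so the recipient can review it later if they choose.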

The feature is welcome but long overdue: social media companies largely ignored unwanted nude photos for years, and they're now a pervasive problem. A 2020 study by University College London found that 75.8 percent of the 150 young people aged 12 to 18 it surveyed had been sent unsolicited nude images.

Sending unwanted nude photos, also known as "cyberflashing," has been targeted by multiple jurisdictions, including California and the UK. In the UK, it could become a criminal offense if Parliament passes the Online Safety Bill. California didn't go quite that far, but last month both chambers of the state legislature voted unanimously to allow users to sue over unsolicited nude photos and other sexually graphic material.