Unsolicited nude photos are a widespread problem on Instagram, but developers are already working on a feature that could help address it.
Researcher Alessandro Paluzzi previously posted a screenshot showing that the “Nudity protection” technology “covers photos that may contain nudity in chat,” giving users the choice of whether to view them. Meta confirmed to The Verge that the technology is in development.
Meta said its goal is to shield people from nude images or other unwanted messages. As additional protection, the company said it cannot view the images itself or share them with third parties.
“We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive,” a spokesperson said.
The new feature is similar to the “Hidden Words” tool launched last year, the Meta spokesperson added. That tool lets users filter offensive messages in DM requests based on keywords: if a request contains any filter word you choose, it is automatically moved to a hidden folder that you can choose never to open, although it is not deleted entirely.
Nudity protection is long awaited: unwanted photos have largely been ignored by social media companies and have become a pervasive problem.