Instagram wants to block nude photos in DMs

The social network Instagram is working on a way to protect users from receiving unsolicited nude photos in their direct messages. A spokesperson for the Meta group (Instagram's parent company) confirmed this to The Verge, stressing that the protection is still under development.

"We are working closely with experts to ensure that these new features preserve people's privacy while giving them control over the messages they receive," the Meta spokesperson added.

The technology to be put in place would not allow Meta to view the actual private messages, nor to share them with third parties. No further details are available at this stage on what appears to be a filter similar to the existing protection against abusive direct messages.

Against cyberflashing

With that existing feature, direct message requests containing offensive words, phrases or emojis are automatically filtered out. It works much like the comment filters used to hide offensive comments.

Protecting against unsolicited nude photos in direct messages will require adapting that approach. The Verge stresses that such protection is needed to counter cyberflashing, the practice of sending unsolicited sexual messages, often photos of genitals, to strangers, most often women.

According to a report by the British NGO Center for Countering Digital Hate, which examined more than 8,700 direct messages sent to five high-profile women on Instagram, the platform failed to act in 90% of reported cases of abuse.

A first look at the tool

Mobile developer and reverse-engineering specialist Alessandro Paluzzi shared on Twitter an image of the new protection tool Instagram is working on (above).

The feature appears to be optional, with the choice of whether or not to view hidden photos. "Technology on your device covers photos that may contain nudity in chats. Instagram can't access the photos," the screenshot reads.
