Instagram is developing a tool to prevent users from receiving unsolicited nude photos in their DMs

Instagram is developing a tool capable of blocking unsolicited nude photos sent in a direct message (DM), a spokesperson for its parent company Meta confirmed.

Known as “Nudity Protection”, the feature would work by detecting a nude image and covering it up, before giving the user the option to open it or not.

More details are expected to be released in the coming weeks, but Instagram says it won’t be able to see the actual images or share them with third parties.

This was confirmed by Liz Fernandez, Product Communication Manager at Meta, who said it would help users “protect themselves from nude photos and other unwanted messages”.

She told The Verge: “We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive.”



News of the feature was first announced on Twitter by the leaker and mobile developer Alessandro Paluzzi.

He said “Instagram is working on nudity protection for chats” and posted a screenshot of what users can see when opening the feature.

It said: ‘Safely detect and cover nudity. Your device’s technology covers photos that may contain nudity in chats. Instagram can’t access photos.

‘Choose to display the photos or not. Photos will remain covered unless you choose to view them.

‘Get safety tips. Learn how to stay safe if you interact with sensitive photos.

‘Enable or disable at any time. Update in your settings.’



Ms Fernandez compared the feature to the “Hidden Words” feature on Instagram, which was introduced last year.


This allows users to automatically filter out messages containing words, phrases, and emojis they don’t want to see.

She also confirmed that nudity protection will be a voluntary feature that users can turn on and off as they see fit.

It’s still in the early stages of development, but hopefully it will help reduce incidents of “cyber-flashing”.

Cyber-flashing occurs when a person receives an unsolicited sexual image on their mobile device from an unknown person nearby.

This can be done through social networks, messaging apps or other sharing functions such as AirDrop or Bluetooth.


In March, UK ministers announced that men who send unsolicited ‘d**k photos’ will soon face up to two years in jail (stock image)

HOW WILL THE “NUDITY PROTECTION” TOOL WORK?

The new Nudity Protection tool will work by detecting images sent to the user in a chat that may contain nudity.

It will automatically cover the image and the user can choose to show it or not when they open the message.

Instagram will not be able to access the photos and the user can turn the feature on or off at any time.

In March, it was announced that men who send unsolicited “d**k photos” will soon face up to two years in prison.

Ministers have confirmed that laws prohibiting this behavior will be included in the government’s Online Safety Bill, which is expected to become law in early 2023.

This ruling will apply to England and Wales, as cyber-flashing has been illegal in Scotland since 2010.

It came after a study by the UCL Institute of Education found that non-consensual image-sharing practices were “particularly widespread, and therefore normalized and accepted”.

The researchers interviewed 144 boys and girls aged 12 to 18 in focus groups, and another 336 in a digital image sharing survey.

Thirty-seven percent of the 122 girls surveyed had received an unwanted sexual photo or video online.

A shocking 75% of girls in the focus groups had also been sent an explicit photo of male genitalia, with the majority “not asked for”.

Snapchat was the most commonly used platform for image-based sexual harassment, according to survey results.

But young people considered reporting images on Snapchat “useless” because the images are automatically deleted.

Additionally, a study by YouGov found that four in ten millennial women have been sent a photo of a man’s genitals without their consent.

Men who send unsolicited “*** pics” may be NARCISSISTS and usually expect to receive “something in return”

Men who send other people unsolicited pictures of their genitals are likely to be more narcissistic and sexist than those who don’t, psychologists have found.

Researchers at Pennsylvania State University interviewed more than a thousand men to compare the personalities and motivations of those who sent intimate images and those who did not.

Rather than for personal gratification, men who share images of their genitals usually do so in the hopes of arousing the recipient and receiving images in return.

A small minority of participants said they sent private photos in an attempt to intentionally elicit a negative response from women.

The researchers conclude that the practice can neither be construed as solely sexist nor as a positive sexual outlet.
