
Facebook anti-revenge porn test “remembers” users’ nude photos

Facebook is testing a system to combat revenge porn that involves users sending their intimate photos to themselves so the company can recognize and block the photos if they’re ever posted without consent.

The social media company is partnering with the Australian government’s Office of the eSafety Commissioner for the pilot test, the Australian Broadcasting Corporation (ABC) reported.

Users who fear their nude photos could end up on Facebook or Instagram can contact the office, which may then direct them to send a copy of the photo to themselves on Facebook Messenger, BBC News reports. Facebook would then “hash” the image, creating a digital fingerprint that could be used to identify the image if someone else tries to post it.

Under the system, Facebook would not store the image itself, which the company says protects the photos from possible hackers.
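Facebook has not published the exact matching algorithm it uses, so the following is only a minimal sketch of the general idea: a perceptual hash turns an image into a short fingerprint that can be compared against later uploads without keeping the image itself. The average-hash approach, file names, and match threshold below are illustrative assumptions, not the company’s actual system.

```python
# Illustrative sketch only: Facebook has not disclosed its photo-matching algorithm.
# This shows the general idea of a perceptual hash (here, a simple "average hash")
# that fingerprints an image without storing the image itself.
# Requires Pillow (pip install Pillow); file paths below are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Return a 64-bit fingerprint of the image at `path`."""
    # Shrink to a small grayscale thumbnail so the hash tolerates resizing
    # and minor edits.
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average.
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests the same underlying image."""
    return bin(a ^ b).count("1")

# Hypothetical usage: hash the reported photo once, then compare later uploads.
original = average_hash("reported_photo.jpg")
upload = average_hash("new_upload.jpg")
if hamming_distance(original, upload) <= 5:  # threshold is an assumption
    print("Likely a match; flag the upload for blocking.")
```

Only the fingerprint (an integer here) would need to be retained, which is consistent with the article’s point that the image itself is never stored.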

“They’re not storing the image, they’re storing the link and using artificial intelligence and other photo-matching technologies,” said Julie Inman Grant, the eSafety Commissioner.

The ABC reports that one in five Australian women between the ages of 18 and 45 has been a victim of “image-based abuse.”

Professor Clare McGlynn at Durham Law School in the U.K. called the test “an innovative experiment” but said it doesn’t go far enough to address the issue of abuse on the platform.

“I welcome Facebook taking steps to tackle this issue, as it has often been very slow to act in the past,” she told BBC News. “However, this approach is only ever going to work for a few people and when we think of the vast number of nudes taken and shared each day, this clearly isn’t a solution.”

Facebook was at the center of a scandal in March involving members of the U.S. military who shared nude photos of fellow service members in private Facebook groups without their consent. In April, the company rolled out new tools to combat revenge porn, including the use of artificial intelligence and image-recognition tools to detect and remove nonconsensual images.


KION546 News Team
