Facebook fails to notify users when they're victims of revenge porn

Facebook’s AI is getting better at identifying revenge porn, but the company still falls short when it comes to how it treats victims of the crime. 

The social media giant doesn’t plan to notify victims if an intimate image of them has been shared nonconsensually on the site, according to the Daily Dot. 

Instead, the offending images are simply reviewed and removed by a specially trained team of Facebook employees, while the victim largely remains in the dark.

In 2017, Facebook started using photo-matching technology to crack down on revenge porn. 

After its reviewers detect revenge porn on the site, they create a hash of the image, or 'digital fingerprint,' to ensure it cannot be shared again, a Facebook spokesperson told the Daily Dot.
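
Facebook hasn't published the details of its photo-matching system, but hash-based image matching is a well-established technique. The sketch below, written in Python with the open-source Pillow and imagehash libraries, shows the general idea; the perceptual hash, threshold, and function names are illustrative assumptions, not Facebook's actual implementation.

```python
# Illustrative sketch of hash-based photo matching; not Facebook's code.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Perceptual hashes ("digital fingerprints") of images already removed.
known_hashes = set()

# Assumed tolerance: perceptual hashes of near-identical images differ
# by only a few bits, so a small Hamming distance counts as a match.
HAMMING_THRESHOLD = 5

def register_removed_image(path):
    """Fingerprint an image confirmed as nonconsensual and remember it."""
    known_hashes.add(imagehash.phash(Image.open(path)))

def upload_is_blocked(path):
    """Check a new upload against every stored fingerprint."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= HAMMING_THRESHOLD
               for known in known_hashes)
```

Unlike a cryptographic hash, a perceptual hash survives resizing and recompression, which is why it is the standard choice for blocking re-uploads of the same photo.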

Then, in March, Facebook launched new AI and machine learning tools to detect intimate photos and videos posted without the subject’s consent. 

Once the images are flagged, they’re sent to human moderators for further review.
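
Facebook hasn't described the pipeline's internals, but the flag-then-review architecture it outlines is straightforward. Here is a minimal sketch, with an assumed classifier stub and confidence threshold standing in for the unpublished model:

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Upload:
    upload_id: str
    image_bytes: bytes

# Uploads the model flags wait here for a human moderator's decision.
review_queue = Queue()

# Assumed confidence cutoff; Facebook hasn't published its threshold.
FLAG_THRESHOLD = 0.8

def classifier_score(upload):
    """Stand-in for the trained model, which Facebook hasn't released."""
    return 0.0  # placeholder: a real model would score likely violations

def triage(upload):
    """The machine flags likely violations; humans make the final call."""
    if classifier_score(upload) >= FLAG_THRESHOLD:
        review_queue.put(upload)
```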

‘By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram,’ Antigone Davis, Facebook’s global head of safety, wrote in a blog post announcing the new tools. 

‘This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared.’ 

These features build on a previous pilot program, launched in 2018, wherein Facebook began letting users ‘securely’ send in naked photos they fear might be posted publicly. 

After creating a hash of the image, Facebook said it would delete the photo itself from its servers. 
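
In other words, Facebook keeps only the fingerprint, not the photo. Below is a minimal sketch of that 'hash, then delete' flow, using a cryptographic hash and local files purely for illustration:

```python
import hashlib
import os

# Fingerprints of proactively submitted photos; the images themselves
# are deleted once fingerprinted.
blocked_fingerprints = set()

def submit_preemptively(path):
    """Record a fingerprint of a submitted photo, then delete the file."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    blocked_fingerprints.add(digest)  # keep only the hash...
    os.remove(path)                   # ...and discard the image itself
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; to catch resized or recompressed copies, the matching step would need something closer to the perceptual approach sketched earlier.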

‘This program gives people an emergency option to securely and proactively submit a photo to Facebook,’ Davis explained. 

At the time, Facebook said it had received ‘positive feedback’ from victims and advocacy organizations, so it was expanding the pilot program to more areas.  

But by failing to notify users if they’ve been a victim of revenge porn, Facebook may not be doing all it can to tackle the issue.

‘I understand why Facebook wouldn’t want to be involved in proactively reaching out to victims,’ Kelsey Bressler, a member of Badass Army, a non-profit helping victims of revenge porn, told the Daily Dot.

Doing so would require a modicum of ‘bedside manner,’ she added. 

Facebook may also be withholding notifications in the belief that doing so shields users from the emotional distress of learning that someone tried to upload an intimate image of them, Gizmodo noted. 

However, given that users trust the site to handle their intimate images, many believe it's only fair for Facebook to notify them when an offending photo is detected. 

Facebook could create an opt-in feature that lets users elect to be notified when a nonconsensually uploaded intimate photo of them is detected; that way, no one would be forced to receive those updates automatically, Gizmodo said.

For now, it doesn’t appear that Facebook intends to roll out such a tool anytime soon.    
