Facebook Takes on Revenge Porn

Facebook announced this week that it will expand its effort to combat “revenge porn” — the act of sharing sexually explicit images of someone online without their consent — by asking users to voluntarily submit their own racy photos.
Sounds legit, right?
It actually could work, according to tech leaders and activists who argue that in order to combat unauthorized sharing of photos, there needs to be a tech-driven solution.
“It’s demeaning and devastating when someone’s intimate images are shared without their permission, and we want to do everything we can to help victims of this abuse,” said Antigone Davis, global head of safety for Facebook, in a post on the site.
Last year, the social media giant released a set of tools that, on the back end, uses photo-matching technology to stop an intimate image or video from being posted again once it’s already been reported and removed from Facebook-owned platforms, including Messenger and Instagram.
But that system is reactive: an image has to be published for all the world to see before it gets taken down.
“Even if these platforms are willing to take [the material] down, we know it can be very distressing for someone to have these images seen by their employer or their family members or peers,” says Erica Olsen, director of the Safety Net Project for the National Network to End Domestic Violence (NNEDV). “That reactive measure of taking them down after they’ve already been online — the damage is kind of done, in some way.”
In response, Facebook is taking a tip from survivors’ own proactive measures — some of whom have used copyright law to block unauthorized images, in certain cases before they can even go up.
Olsen says that survivors have been known to take all the images or videos in their possession and register the copyright to them. That way, if a photo or video is uploaded to a site that has a copyright filter, the image is immediately flagged and removed. For sites that don’t have those protections in place, copyright law still applies: the owner can legally demand that the images be taken down.
With the new protocol, people can upload intimate images of themselves, privately, to ensure that they can’t be uploaded by anyone else. Facebook has partnered with several safety organizations, including NNEDV, which will facilitate the process: users are provided a one-time link to submit any images of themselves that they do not want shared online. The images are reviewed and given a unique hash — a digital fingerprint that can identify attempts to upload the same material in the future without Facebook having to store the original on its servers.
In this way, sensitive and damaging photos and videos are preemptively flagged and blocked from ever being seen by the public at large.
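The hashing scheme described above can be illustrated with a simplified sketch. Note the assumptions: Facebook’s actual system reportedly uses perceptual photo-matching, which is robust to resizing and re-encoding, whereas the plain cryptographic hash below only catches byte-identical copies; the function and variable names here are hypothetical, not Facebook’s.

```python
import hashlib

# Hypothetical in-memory store of fingerprints of privately submitted images.
# Only the hashes are kept; the original image bytes are never stored.
blocked_hashes = set()

def fingerprint(image_bytes: bytes) -> str:
    """Derive a fixed-size fingerprint from the image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_image(image_bytes: bytes) -> None:
    """Record the fingerprint of an image the user never wants shared."""
    blocked_hashes.add(fingerprint(image_bytes))

def is_blocked(upload_bytes: bytes) -> bool:
    """Check an incoming upload against the store of known fingerprints."""
    return fingerprint(upload_bytes) in blocked_hashes

# A user privately submits an image; later upload attempts are matched.
register_image(b"...private photo bytes...")
print(is_blocked(b"...private photo bytes..."))  # True: exact copy is caught
print(is_blocked(b"...a different photo..."))    # False: no match on record
```

The key design point the article describes is visible here: matching requires retaining only the fingerprint, not the sensitive image itself.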
Revenge porn has become a widespread problem. According to a 2016 study by the Data & Society Research Institute, more than 10 million Americans have either had someone threaten to post lewd images of them, or have been the victim of such images being shared online without their consent.
Amanda Levendowski, a clinical teaching fellow at the Technology Law and Policy Clinic at New York University, has written extensively on how copyright law can be an effective — and proactive — solution to fighting revenge porn.
“Because an estimated 80 percent of revenge porn images are selfies, meaning that the subject and the photographer are one and the same, the vast majority of victims can use copyright law to protect themselves,” she says.
Olsen has seen the tactic work. “In many cases, survivors have learned that copyright laws [make it] a little bit easier to go after sites and get images taken down,” she says.
Now, with the rollout of Facebook’s new proactive self-reporting program, victims and survivors have one more tool in their arsenal to fight back against online harassment.