(NEW YORK) — Facebook announced new steps to fight revenge porn on its platforms on Friday, including an option to “provide a photo proactively to Facebook” so that it never gets shared there in the first place.
The company outlined a multipronged strategy to detect and flag non-consensual intimate content on Facebook and Instagram before it is reported, Antigone Davis, Facebook’s global head of safety, wrote in a statement announcing the initiative.
Remarkably, the company outlined an “emergency option to provide a photo proactively to Facebook, so it never gets shared on our platforms in the first place,” Davis wrote.
In the pilot program, a user sends the photo to themselves on Messenger, and a Facebook specialist turns it into a digital fingerprint. The fingerprint is then stored in a database and checked against future uploads for matches. Facebook says it will expand the pilot over the coming months.
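Facebook has not published the fingerprinting algorithm behind the pilot, but the pattern it describes is a familiar one: a perceptual hash that still matches an image after resizing or recompression. The following is a minimal sketch of that store-and-match flow, using the open-source imagehash library as a stand-in; the function names, threshold, and in-memory database are illustrative, not Facebook’s.

```python
# A sketch of the fingerprint-and-match flow described above. The
# hashing scheme, threshold, and storage are illustrative stand-ins.
import imagehash
from PIL import Image

MATCH_THRESHOLD = 8  # max Hamming distance; an illustrative value, not Facebook's

fingerprint_db: set[imagehash.ImageHash] = set()  # stands in for the stored database

def register_image(path: str) -> None:
    """Store a perceptual fingerprint; the image itself can be discarded."""
    fingerprint_db.add(imagehash.average_hash(Image.open(path)))

def is_known_match(path: str) -> bool:
    """Compare a new upload against every stored fingerprint."""
    candidate = imagehash.average_hash(Image.open(path))
    return any(candidate - stored <= MATCH_THRESHOLD for stored in fingerprint_db)
```

Because only the fingerprint is kept, a match can be detected without the original photo ever being stored or viewed again.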
“We are thrilled to see the pilot expand to incorporate more women’s safety organizations around the world, as many of the requests that we receive are from victims who reside outside of the U.S.,” Holly Jacobs, the founder of the Cyber Civil Rights Initiative (CCRI), said in Facebook’s statement announcing the program.
The social media giant is also using more conventional tactics to deal with revenge porn, such as artificial intelligence that detects near-nude images or videos shared without the subject’s permission on both platforms.
“This means we can find this content before anyone reports it, which is important for two reasons: often, victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” Davis said.
After machine learning detects the image or video, a human being will evaluate it.
“If the image or video violates our Community Standards, we will remove it, and in most cases, we will also disable an account for sharing intimate content without permission,” Davis wrote. “We offer an appeals process if someone believes we’ve made a mistake.”
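Facebook did not describe the internals of that enforcement flow. As a purely hypothetical illustration of the decision logic Davis outlines, with every type and name invented for the example:

```python
# Hypothetical sketch of the detect-review-enforce flow described above.
from dataclasses import dataclass
from enum import Enum, auto

class Verdict(Enum):
    NO_ACTION = auto()
    REMOVE = auto()
    REMOVE_AND_DISABLE = auto()

@dataclass
class ReviewResult:
    violates_standards: bool      # the human reviewer's call on Community Standards
    shared_without_consent: bool  # intimate content posted without permission

def enforce(detector_flagged: bool, review: ReviewResult) -> Verdict:
    """Machine detection only queues a post; a human makes the final call."""
    if not detector_flagged or not review.violates_standards:
        return Verdict.NO_ACTION
    if review.shared_without_consent:
        # Per Davis, the sharing account is also disabled "in most cases".
        return Verdict.REMOVE_AND_DISABLE
    return Verdict.REMOVE
```

A separate appeals path, which Facebook mentions but does not detail, would sit downstream of any non-trivial verdict.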
The company has come under fire in the past for its algorithms inaccurately flagging content as pornographic. In 2016, it banned the iconic “Napalm Girl” photo by photojournalist Nick Ut, formally titled The Terror of War, which won the Pulitzer Prize in 1973. It depicts a naked girl, her clothing burned off by napalm, running down a highway outside Saigon during the Vietnam War. After international backlash, the company reinstated the photo.
The company is also launching an online hub called “Not Without My Consent” on the platform’s Safety Center to help victims respond after revenge porn is posted. It said it consulted with experts to develop the program.
The hub will help victims begin the process of getting the offensive material removed, and make it easier for them to report when their intimate images have been shared on the platform.
Facebook said its new initiatives were created in partnership with Britain’s Revenge Porn Helpline, the U.S.-based CCRI, Pakistan’s Digital Rights Foundation, Brazil’s SaferNet and South Korean professor Lee Ji-yeon.
Copyright © 2019, ABC Radio. All rights reserved.