
Facebook is expanding its controversial anti-revenge porn program to the US, UK, and Canada — Quartz

 

Last November, Australian media reported that Facebook was experimenting with a novel idea to combat revenge porn on its platforms. In a pilot program run in cooperation with the Australian government, users would do something that feels ill-advised: upload a nude image to Facebook. The idea was that, by doing so, Facebook could use duplicate-detection technology to block that specific image if anyone else tried to upload it to the platform in the future.

The program sparked backlash among Facebook observers and internet users. One cybersecurity expert told Quartz at the time that it was a “horrible idea,” questioning the wisdom of giving such sensitive information to a third party—a sentiment hard to shake after the Cambridge Analytica scandal, which revealed just how unsafe user data was on Facebook. Despite the criticism, on May 22, Facebook announced that it was expanding the tests to the US, UK, and Canada.

A company spokesperson told Quartz that Facebook was encouraged by feedback from security, privacy, and women’s safety experts in Australia.

Facebook is changing how the system functions, which may allay some of those worries. Previously, users had to upload the images to Messenger to be reviewed by Facebook, raising concerns about such sensitive images sitting on the platform. Now, Facebook emails the user a “secure link” through which they upload the image directly to the social network’s servers.

To start the process, the user reaches out to Facebook’s partner organization in their home country: the Office of the eSafety Commissioner in Australia, the Cyber Civil Rights Initiative and the National Network to End Domestic Violence in the US, the UK Revenge Porn Helpline, or YWCA Canada. These organizations provide additional support for users, including counseling and legal advice, the spokesperson said. Since the program is still being tested, they also collect feedback from users.

After being uploaded, the images are reviewed by one of the “specially trained” members of Facebook’s team—a step that has also raised eyebrows, since it means handing over intimate photos to a stranger. After that, the images are “hashed,” or given a unique signature that allows Facebook to detect if the same image is ever uploaded. The company says it deletes the image from its servers within seven days, keeping only the hash on file.
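The general mechanics are simple to illustrate. The Python sketch below shows hash-and-match duplicate blocking using the open-source Pillow and imagehash libraries as stand-ins; Facebook has not published its actual matching algorithm, so the choice of a perceptual hash, the function names, and the distance threshold are assumptions for illustration only.

```python
# Illustrative sketch of hash-based duplicate blocking. Facebook has not
# disclosed its real system; the perceptual hash (Pillow + imagehash) and
# the distance threshold here are assumptions, not its actual method.
from PIL import Image
import imagehash

# Hashes kept on file after the original images have been deleted.
blocked_hashes: set[imagehash.ImageHash] = set()

def register_reported_image(path: str) -> None:
    """Compute and store only the hash of a reported image."""
    blocked_hashes.add(imagehash.phash(Image.open(path)))

def upload_is_blocked(path: str, max_distance: int = 5) -> bool:
    """Return True if a newly uploaded image matches a stored hash.

    Comparing hashes by Hamming distance tolerates re-encoding or
    resizing of the same photo; the threshold is purely illustrative.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in blocked_hashes)
```

The appeal of this approach is that a hash is a one-way signature: the service can recognize a resubmitted photo without retaining the photo itself, which is why Facebook says only the hash stays on file once the image is deleted.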

While some civil rights experts have endorsed the idea—most notably the leading anti-revenge porn organization in the US, the Cyber Civil Rights Initiative—users will understandably be hesitant. In comments under the post announcing the expansion of the pilot, some asked why Facebook doesn’t create an app that would let users hash images on their own devices and upload only the hashes to Facebook.

When Quartz raised the idea with Facebook, the spokesperson said it could lead to abuse, with people hashing any image they didn’t want on the platform, giving the example of a politician trying to prevent a compromising photo from being published. When Quartz suggested that Facebook has the technology to detect nude images, the spokesperson said the company’s anti-revenge porn policy is stricter than its community standards, meaning it also covers “near nudity,” which the company’s AI systems are not trained to detect.
