WTF?
Facebook Workers, Not an Algorithm, Will Look at Volunteered Nude Photos to Stop Revenge Porn
Reports this week said a Facebook pilot program would let users volunteer nudes to an algorithm to stop revenge porn—but those nudes will be viewed by a human at the company first.
This week, multiple outlets reported on a Facebook pilot scheme that aims to combat revenge porn. In the program, users send themselves a message containing their nude images; Facebook then creates a fingerprint of each image and uses it to block others from uploading identical or similar pictures.
The approach closely resembles how Silicon Valley companies tackle child abuse material, with one key difference: there is no already-established database of non-consensual pornography to match against.
According to a Facebook spokesperson, Facebook workers will first have to review the full, uncensored versions of the nude images a user volunteers in order to determine whether malicious posts of those images by other users qualify as revenge porn.
Tackling the pervasive problem of revenge porn without anyone examining the images, while also making sure the system is not abused to pull down legitimate pictures, is a difficult task. But the manual nature of Facebook's new process is still something users of the world's biggest social network may want to be aware of before sharing nude pictures in an effort to preempt a revenge porn attack.
As The Guardian reported on Tuesday, the process for Facebook's Australia-focused pilot starts when a user completes an online form with the Australian government's eSafety Commissioner. The user then sends the images they would like flagged to themselves on Facebook Messenger, and an analyst opens each image and creates a fingerprint of it. That fingerprint is what the company will use to detect matching images in the future.
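Facebook has not published the details of its fingerprinting technology, though such systems generally work like the perceptual hashing used to match child abuse imagery: the hash of a photo changes little when the photo is resized or re-compressed, so near-duplicates can be caught without storing the photo itself. As a rough illustration only, the sketch below uses the open-source Python imagehash library; the file names and the match threshold are hypothetical, not Facebook's.

    # Illustrative sketch only: Facebook has not disclosed its algorithm.
    # This uses the open-source `imagehash` library's perceptual hash
    # (pHash) to show how a stored fingerprint can match near-identical
    # copies of an image without retaining the image itself.
    import imagehash
    from PIL import Image

    def fingerprint(path: str) -> imagehash.ImageHash:
        """Compute a 64-bit perceptual hash of an image file."""
        return imagehash.phash(Image.open(path))

    # Fingerprint the volunteered image once; only the hash need be kept.
    volunteered = fingerprint("volunteered_photo.jpg")  # hypothetical file

    def is_match(upload_path: str, threshold: int = 8) -> bool:
        """Flag an upload whose hash is within `threshold` bits of the
        stored fingerprint. A small Hamming distance means the images are
        identical or nearly so (e.g. resized or re-compressed copies)."""
        return (fingerprint(upload_path) - volunteered) <= threshold

    print(is_match("new_upload.jpg"))  # hypothetical file

Subtracting two imagehash fingerprints yields their Hamming distance, so the threshold trades off false positives against missed near-duplicates; whatever Facebook actually uses would face the same trade-off.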
What that and other explanations do not necessarily make clear, however, is that before making that fingerprint, a worker on Facebook’s community operations team will actually look at the uncensored image itself to make sure it really violates Facebook’s policies. A Facebook spokesperson described the process on background, meaning The Daily Beast cannot name or directly quote them.
Facebook will keep hold of these images for a period of time to make sure the company is correctly enforcing those policies. During this retention period, the images will be blurred and available only to a small number of people, according to the Facebook spokesperson. By that point, however, an individual Facebook employee will already have examined the un-blurred versions.
# # #