Facebook Defends the Odd Way It Is Going About Tackling Revenge Porn 


Facebook’s unusual approach to preventing revenge porn has raised a lot of questions. “Revenge porn” is the term for intimate, nude, or sexual images distributed without a person’s consent. The problem has become an epidemic in Australia: according to a recent study, one in five Australians between the ages of 16 and 49 is affected by it.

Facebook has come up with an experimental program to fight it, but its controversial methodology has left many users with doubts. The company clarified its methods in a blog post by Antigone Davis, its Global Head of Safety. If users believe their images or videos could be used by a malicious third party, they must alert the eSafety Commissioner. To participate, a user completes an online form on the website of Australia’s eSafety Commissioner, then sends the images they want to block to themselves on Facebook Messenger. The commissioner’s office notifies Facebook but does not itself share the images. Once Facebook receives the notification, a representative can review and hash the images the user sent.

The approach carries an inherent risk, but “it’s a risk we are trying to balance against the serious, real-world harm that occurs every day when people (mostly women) can’t stop NCII from being posted,” Facebook security chief Alex Stamos explained on Twitter. NCII is short for “non-consensual intimate image”—better known as revenge porn.

On the backend, whenever someone uploads a picture to Facebook, the company checks it against a database of hashes. A hash is like a digital fingerprint, unique to each photo. Facebook and other internet giants already use this technique to fight the online distribution of child pornography, maintaining a database of hashes of known child porn images in order to block them.
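The matching step can be sketched like this. This is a minimal illustration using a cryptographic SHA-256 hash over raw bytes; Facebook’s production systems use perceptual photo-matching hashes (such as Microsoft’s PhotoDNA, used against child-abuse imagery) that survive resizing and re-encoding, and all names and data below are hypothetical:

```python
import hashlib

# Hypothetical database of hashes created after human review of reports.
blocked_hashes = {
    hashlib.sha256(b"reported-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Return a 'digital fingerprint' for an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_upload(image_bytes: bytes) -> bool:
    """Check an uploaded image against the database of known hashes."""
    return hash_image(image_bytes) in blocked_hashes

print(should_block_upload(b"reported-image-bytes"))  # matches a reported image -> True
print(should_block_upload(b"harmless-photo-bytes"))  # no match -> False
```

Note that an exact-byte hash like SHA-256 fails the moment an image is cropped or re-saved, which is precisely why real photo-matching systems use perceptual hashes instead.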

“To prevent adversarial reporting, at this time we need to have humans review the images in a controlled, secure environment,” Stamos further explained on Twitter. “We are not asking random people to submit their nude photos. This is a test to provide some option to victims to take back control. The test will help us figure out how to best protect people on our products and elsewhere.”


Tanuja Thombre
A soft skills and behavior trainer by passion and profession, with 8 years of experience in the mortgage banking sector. Currently I am working as a training consultant, catering to training needs across various industries. This also allows me to interact with, train, and learn about various aspects of human behavior. I hold certifications from institutes such as Dale Carnegie and Stephen Covey. I have a natural instinct for writing; every once in a while, a blog or a short article, and in the future I plan to author a book. When it comes to writing, I believe there is seldom anything as appealing as simplicity.
