Facebook's Imperfect Attempt at Preventing Revenge Porn Reveals the Dark Side of a Photo-Laden Future



Facebook recently announced the launch of a new tool aimed at preventing cases of revenge porn, an act in which an ex posts intimate imagery of a former partner without permission in an attempt to humiliate them or cause them serious distress. While obviously well intentioned, the program has a major flaw and highlights a future in which having photos of every part of our lives will not always be a good thing.

I generally love how much photography has permeated our lives. I don’t mean the honking DSLR, hours-in-Photoshop type of photography, but simply the fact that I have a smartphone with a decent enough camera to document and share any moment of my life I care to. I’m just old enough to remember life without omnipresent interconnectivity, and I can definitely see the difference in friendships from before the onset of that era and after. Most of my friends are out of state, and yet, we easily stay in touch and are up to date on the details of each other’s lives.

Of course, I’m not saying anything groundbreaking by acknowledging the ease with which cell phones and computers connect people. Nonetheless, we live in a world where we trade privacy for all sorts of things and our actions are increasingly documented and broadcast. In some ways, that’s a good thing: people are being held accountable for behavior that would never have seen the light of day decades ago; academic, professional, and social connections that would once have been impossible are now made with ease; and the ability to share knowledge and experiences has never been stronger. But there is, of course, a major downside to all this.

I don’t need to tell you that there are ill-intentioned people in the world, and with the aforementioned capabilities also comes the ability to do great harm. Such is the objective of revenge porn. Before digital cameras and smartphones, we took fewer pictures and had easier control over their distribution. Now, we take orders of magnitude more images, and once a single digital copy has left our control, it’s remarkably easy to broadcast it almost anywhere to an audience of almost any size. That makes it easy to do great damage to someone, and we see that in all sorts of manifestations in this digital world, including revenge porn. The problem is further complicated by the fact that the victim may not always know they’re a victim: take the Marines’ secret Facebook group scandal, for example.

Facebook’s solution to that specific problem is this: you send yourself any images you do not wish to be maliciously distributed. A Facebook employee reviews the images to ensure they actually violate the platform’s standards and policies, then flags them in the system, which generates a digital fingerprint that prevents the images from being uploaded to the platform. The images remain in the system as blurred versions, accessible to a “small number of people” for some time to ensure that the enforcement is being carried out correctly. Out of all this, the key thing to note is that at the beginning of the process, at least one person has to view the uncensored versions of the photos. Facebook already follows a similar protocol to prevent such content from being reposted to its platforms after it has been reported and removed, but this marks the first attempt at stopping such instances before they occur.
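To make the fingerprinting idea concrete, here is a minimal sketch of how hash-based blocking can work in principle. It assumes a simple 64-bit “average hash” and an in-memory set of flagged fingerprints; Facebook’s actual system uses its own proprietary hashing, so the names, structure, and matching threshold below are purely illustrative.

# A minimal sketch, not Facebook's actual system: a 64-bit "average hash"
# fingerprint plus an in-memory set of flagged hashes. All names and the
# matching threshold here are illustrative assumptions.
from PIL import Image  # requires the Pillow package

def average_hash(path: str) -> int:
    """Resize to 8x8 grayscale, compare each pixel to the mean, pack into 64 bits."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

flagged_hashes: set[int] = set()  # hypothetical store of fingerprints flagged during human review

def flag_image(path: str) -> None:
    """Record the fingerprint of an image a reviewer has flagged."""
    flagged_hashes.add(average_hash(path))

def upload_allowed(path: str, threshold: int = 5) -> bool:
    """Reject any upload whose fingerprint is within a few bits of a flagged one."""
    h = average_hash(path)
    return all(hamming(h, f) > threshold for f in flagged_hashes)

The appeal of fingerprinting is that only the hash needs to be kept long term, and near-duplicates that survive resizing or recompression can still be caught; the obvious limitation, as discussed below, is that someone still has to see the original image before it can be fingerprinted at all.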

One of the biggest sources of damage when someone posts revenge porn is that the target’s privacy and ability to control access to intimate parts of their life are breached. So, if someone is a victim of revenge porn, or feels they’re in imminent danger of becoming one, it’s understandable that they may be hesitant to send the images to an unknown person and that doing so may do little to assuage the feeling of lost control and privacy. After all, this doesn’t offer the victim control over the dissemination of private images to strangers; it merely offers them control over which strangers the images go to. That’s not especially comforting.

So, Facebook’s solution is obviously very imperfect. The company’s justification for requiring a human to view the images is that without this step, it would be exceedingly easy for someone to abuse the system to create inappropriate censorship, such as preventing normal images of a political candidate from making their way onto the platform. That’s a fair point, but the system is predicated upon the victim trusting the morality of Facebook employees and of the company itself, not to mention still surrendering their most intimate privacy to a stranger, just a different stranger than it would have been otherwise. When someone’s trust has been violated at the most fundamental level, those are some pretty hefty leaps to ask them to make.

I’m no computer expert, and I don’t claim to have a solution to this; rather, I think it highlights a real danger of our image-laden future. The negative consequences of technological advances tend to lag behind the novelty of their introduction, and we’ve often proven either mediocre at anticipating all possible misuses of a technology or simply overconfident in our ability to prevent or control those misuses. And on a more philosophical level, I think it highlights a certain awful absurdity that has been brought to the fore: if you want to protect your most intimate privacy, you have to give it up to a corporate entity first. Surely, that cannot and should not be the ultimate solution to this.

The unfortunate truth is that although the ability to take and share photos more easily than ever presents numerous positive opportunities and outcomes, it also enables extremely malicious and damaging behavior for which there isn’t yet an obvious solution. As photography’s role in our daily lives and society at large only continues to grow, I certainly hope our ability to stop its misuses grows with it.

Lead image by Tracy Le Blanc, used under Creative Commons.
