Let’s talk Taylor, deepfake porn, and the queer community

Call me backward, but my true introduction to the term “deepfake porn” came in January with the unsettling news that Taylor Swift had become a victim of it. She emerged as the focus of what’s called “image-based abuse,” in which sexually explicit content is created using AI to realistically superimpose people’s faces onto bodies not their own. I was curious, and with a bit of easy browsing it didn’t take me long to find the repulsive images.

Where there’s a will, there’s a way to deepfake anybody

Of course, I’ve since learned that deepfake porn has been around for a number of years, but I’d say that many of us didn’t pay much attention until Taylor was targeted. (I’m curious how long you might have known about it, or whether, like me, this was your first real recognition of it.)

As horrific as it is, I have to suggest that there may be an advantage to the fact that a major global celebrity got slammed with this online assault, because it’s bringing widespread attention to the issue. Taylor being subjected to this abuse has caused a massive public uproar owing to her immense popularity. As a result, there’s been high-profile attention emphasizing the trauma and violations of privacy that victims have to endure. The spread of these images has further sparked talk about the need for new laws to explicitly criminalize the creation and dissemination of deepfake porn.

And I say, “good luck with that.”

The Dark Side will always find a way, so what do we do?

If you ask me, enacting laws to put a stop to this perverse activity is futile. I’m no expert on AI, but from following its developments and trends, and from using it myself, I believe we’re headed to the point of no return. The culprits lurking in this dark side of AI will outsmart any safeguards we try to put into place to protect individuals’ rights and dignity in this digital age. Of course, having laws on the books to keep “honest people honest” is a practical move, but I believe it will do little to stop the creeps who traffic in it.

Yes, we can also be outraged and traumatized – but where will that get us?

This dilemma reminds me of two examples that can support us in our thinking about this issue:

Our Measured Reaction is Vital

When toddlers lose their balance and take a tumble, the caregivers, of course, should immediately attend to the situation. But to be supportive and soothing, it’s suggested that the caregivers stay low-key and avoid making a scene with loud upset and panic over the child’s fall. If they stir up fear, the child, who otherwise may not have given her tumble a second thought, could be prompted to believe something is terribly wrong and plug into that drama. Hence, you have a small child scared out of her wits and screaming at the top of her lungs because of the panicked reactions around her.

I equate this with our public reactions to people (mostly women) who have been “deepfaked.” Reactions like “What happened to you is so awful!” or “You must be so ashamed!” or even “I’m so sorry for you!”

Given what I just said about the panic from the caregivers of a toddler, does this reaction sound familiar? It may not be surprising that the recipient of the “sympathizing” comments could react like the panicked toddler.

Since increasing numbers of women are going to find themselves in this deepfake situation, I’d suggest a more measured and practical response would do us all a world of good. How about something like, “Well, this is happening to a lot of people – you’re not the only one. Remember, it’s just your face applied to whatever body and activity these creeps decide to add.”

This may be heresy, but in considering solutions, there’s a part of me that wants to normalize the experience, as though all of us were victims of deepfake porn. If a new, measured, and practical “normal” replaces the hysteria currently surrounding this phenomenon, then the societal hue and cry evaporates, and there’s no need for any of us to carry shame for being the subject of deepfake porn. We’re all in it together… it happened to you, it happened to me, it could happen to anybody… and so what?

Which brings me to our second instructive example:

This example comes from what the gay community did about being called “queer.” For many years, it was a slur aimed at dehumanizing and marginalizing gay people. But beginning in the late 1980s and early 1990s, activists and communities started to reclaim the word “queer” as a way to challenge societal norms and the stigmatization of their sexuality. It wasn’t just about taking back a word; it was about rejecting the shame imposed by society and, in a cultural turnabout, asserting dignity, identity, and pride as a “queer” person. Today, “queer” is seen as a term of inclusivity and resistance, encompassing a broad spectrum of sexual orientations and gender identities.

In reclaiming “queer,” I’d suggest there’s a powerful message for us in regard to deepfake porn: what the queer community achieved is a remarkable example of transforming a word meant to disrespect and foster shame.

Of course, with deepfakes, we’re not talking about simply a word. We’re talking about the experience of a woman seeing a visual of herself that she perceives as deeply degrading and humiliating. The challenge before us, then, is to do a 180° and change the perception of these images in order to reframe the experience for the “victims.”

How to do this is a deeply contemplative question. 

But being able to see the instigators of deepfake porn for who they are might be a start. Experts on the psychology of those creating deepfake porn point to men, usually young men, who are socially isolated, struggle with forming connections, and are unable to engage effectively with the opposite sex or the dating scene.

These people may never be found and held responsible for their actions, but the rest of us can be aware that these are the actions of sick individuals who should be pitied and marginalized by ignoring their very public cries for attention.

Again, I don’t expect deepfake porn to go away; on the contrary, it may well proliferate despite our most earnest efforts in shaping public policy. But what we can alter is our perception of it.

Pornography will remain accessible, and even if someone puts our face on an imaginary body, it’s important to remember that it remains merely an image. With the right perspective, such an image can be seen as harmless if we choose to view it that way.

“Sticks and stones may break my bones but words will never hurt me.”*

With Love,

Becca 

P.S. How does this situation strike you? Let me know what you think of what I’ve said as a possible solution.

*P.P.S. “Sticks and Stones” is an English-language children’s rhyme. The rhyme is used as a defense against name-calling and verbal bullying, intended to increase resiliency, avoid physical retaliation, and help the target remain calm and indifferent.

2 thoughts on “Let’s talk Taylor, deepfake porn, and the queer community”

  1. Hi Becca, I couldn’t agree more. There needs to be a space where we can acknowledge harm and dark intentions and then be supported in pushing back against shame and victim identities that do not serve us. With all the craziness and insanity in the world, utilizing strategies that help us build stamina and take action is more important than ever. I don’t follow Taylor Swift, but she’s clearly a force and in a strategic position to use her power and influence to give voice to truth, or maybe build a movement that will help people move forward with integrity. I think we are all being called to take a stand on what is right and what will heal.

  2. Tori, the “utilizing strategies that help us build stamina” really resonates for me. Here’s one, outrageous as it may sound: we each take it upon ourselves to deepfake ourselves before anyone ever does it to us. This would be mighty powerful for younger women … it’d be a type of exposure therapy that might effectively mute the pervasive efforts of the creeps.
    So, for example, whenever a particular woman is the victim of image-based abuse, untold others could deepfake themselves in solidarity, flooding the internet with these images. In no time at all, millions of images would make the reprehensible actions irrelevant! That definitely would build strength and stamina among women, methinks.
