
Dear Taylor Swift, we’re sorry about those explicit deepfakes


Hi, Taylor.

I can only imagine how you must be feeling after sexually explicit deepfake videos of you went viral on X. Disgusted. Distressed, perhaps. Humiliated, even.

I’m really sorry this is happening to you. No one deserves to have their image exploited like that. But if you aren’t already, I’m asking you to be furious.

Furious that this is happening to you and so many other women and marginalized people around the globe. Furious that our current laws are woefully inept at protecting us from violations like this. Furious that men (because let’s face it, it’s mostly men doing this) can violate us in such an intimate way and walk away unscathed and unidentified. Furious that the companies that enable this material to be created and shared widely face no consequences either, and may even profit from such a horrendous use of their technology.

Deepfake porn has been around for years, but its latest incarnation is its worst one yet. Generative AI has made it ridiculously easy and cheap to create realistic deepfakes. And nearly all deepfakes are made for porn. Just one image plucked off social media is enough to generate something passable. Anyone who has ever posted a photo of themselves online, or had one posted by someone else, is a sitting duck.

First, the bad news. At the moment, we have no good ways to fight this. I just published a story on three ways we can combat nonconsensual deepfake porn, which include watermarks and data-poisoning tools. But the truth is that there is no neat technical fix for this problem. The fixes we do have are still experimental and haven’t been widely adopted by the tech sector, which limits their power.

The tech sector has so far been unwilling or unmotivated to make changes that would prevent such material from being created with its tools or shared on its platforms. That’s why we need regulation.

People with power, like yourself, can fight back with money and lawyers. But low-income women, women of color, women fleeing abusive partners, women journalists, and even children are all seeing their likeness stolen and pornified, with no way to seek justice or support. Any one of your fans could be hurt by this development.

The good news is that the fact that this happened to you means politicians in the US are listening. You have a rare opportunity, and momentum, to push through real, actionable change.

I know you fight for what is right and aren’t afraid to speak up when you see injustice. There will be intense lobbying against any rules that would affect tech companies. But you have a platform and the power to convince lawmakers across the board that rules to combat these kinds of deepfakes are a necessity. Tech companies and politicians need to know that the days of dithering are over. The people creating these deepfakes must be held accountable.

You once caused an actual earthquake. Winning the fight against nonconsensual deepfakes would have an even more earth-shaking impact.
