I Shouldn’t Have to Accept Being in Deepfake Porn



Because I was in the public eye, somebody synthesized explicit videos of me.

Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn’t shocked. For more than a year, I have been the target of a widespread online harassment campaign, and deepfake porn (whose creators, using artificial intelligence, generate explicit video clips that seem to show real people in sexual situations that never actually occurred) has become a prized weapon in the arsenal misogynists use to try to drive women out of public life. The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology, and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent. Many commentators have been tying themselves in knots over the potential threats posed by artificial intelligence: deepfake videos that tip elections or start wars, job-destroying deployments of ChatGPT and other generative technologies. Yet policy makers have all but ignored an urgent AI problem that is already affecting many lives, including mine.


Last year, I resigned as head of the Department of Homeland Security’s Disinformation Governance Board, a policy-coordination body that the Biden administration let founder amid criticism mostly from the right. In subsequent months, at least three artificially generated videos that appear to show me engaging in sex acts were uploaded to websites specializing in deepfake porn. The images don’t look much like me; the generative-AI models that spat them out seem to have been trained on my official U.S. government portrait, taken when I was six months pregnant. Whoever created the videos likely used a free “face swap” tool, essentially pasting my photo onto an existing porn video. In some moments, the original performer’s mouth is visible while the deepfake Frankenstein moves and my face flickers. But these videos aren’t meant to be convincing: all of the websites and the individual videos they host are clearly labeled as fakes. Although they may provide cheap thrills for the viewer, their deeper purpose is to humiliate, shame, and objectify women, especially women who have the temerity to speak out. I am somewhat inured to this abuse, after researching and writing about it for years. But for other women, especially those in more conservative or patriarchal environments, appearing in a deepfake-porn video could be profoundly stigmatizing, even career- or life-threatening.

As if to underscore video makers’ compulsion to punish women who speak out, one of the videos to which Google alerted me depicts me with Hillary Clinton and Greta Thunberg. Because of their global celebrity, deepfakes of the former presidential candidate and the climate-change activist are far more numerous and more graphic than those of me. Users can also easily find deepfake-porn videos of the singer Taylor Swift, the actress Emma Watson, and the former Fox News host Megyn Kelly; Democratic officials such as Kamala Harris, Nancy Pelosi, and Alexandria Ocasio-Cortez; the Republicans Nikki Haley and Elise Stefanik; and countless other prominent women. By simply existing as women in public life, we have all become targets, stripped of our accomplishments, our intellect, and our activism and reduced to sex objects for the pleasure of millions of anonymous eyes.

Men, of course, are subject to this abuse far less frequently. In reporting this article, I searched the name Donald Trump on one prominent deepfake-porn website and turned up one video of the former president, and three entire pages of videos depicting his wife, Melania, and daughter Ivanka. A 2019 study from Sensity, a company that monitors synthetic media, estimated that more than 96 percent of deepfakes then in existence were nonconsensual pornography of women. The reasons for this disproportion are interconnected, and are both technical and motivational: The people making these videos are presumably heterosexual men who value their own gratification more than they value women’s personhood. And because AI systems are trained on an internet that abounds with images of women’s bodies, much of the nonconsensual porn that those systems generate is more believable than, say, computer-generated clips of cute animals playing would be.


As I looked into the provenance of the videos in which I appear (I’m a disinformation researcher, after all), I stumbled upon deepfake-porn forums where users are remarkably nonchalant about the invasion of privacy they are perpetrating. Some seem to believe that they have a right to distribute these images: that because they fed a publicly available photo of a woman into an application engineered to make pornography, they have created art or a legitimate work of parody. Others apparently think that simply by labeling their videos and images as fake, they can avoid any legal consequences for their actions. These purveyors assert that their videos are for entertainment and educational purposes only. But by using that description for videos of well-known women being “humiliated” or “pounded,” as the titles of some clips put it, these men reveal a lot about what they find pleasurable and informative.

Ironically, some creators who post in deepfake forums show great concern for their own safety and privacy (in one forum thread that I found, a man is ridiculed for having signed up with a face-swapping app that does not protect user data) but insist that the women they depict do not have those same rights, because they have chosen public career paths. The most chilling page I found lists women who are turning 18 this year; they are removed on their birthdays from “blacklists” that deepfake-forum hosts maintain so they don’t run afoul of laws against child pornography.

Effective laws are exactly what the victims of deepfake porn need. Several states, including Virginia and California, have outlawed the distribution of deepfake porn. But for victims living outside these jurisdictions or seeking justice against perpetrators based elsewhere, these laws have little effect. In my own case, finding out who created these videos is probably not worth the time and money. I could attempt to subpoena platforms for information about the users who uploaded the videos, but even if the sites had those details and shared them with me, if my abusers live out of state, or in a different country, there is little I could do to bring them to justice.

Representative Joseph Morelle of New York is attempting to close this jurisdictional loophole by reintroducing the Preventing Deepfakes of Intimate Images Act, a proposed amendment to the 2022 reauthorization of the Violence Against Women Act. Morelle’s bill would impose a nationwide ban on the distribution of deepfakes without the explicit consent of the people depicted in the image or video. The measure would also provide victims with somewhat easier recourse when they find themselves unwittingly starring in nonconsensual porn.

In the absence of strong federal legislation, the avenues available to me for mitigating the harm caused by these deepfakes are not all that encouraging. I can request that Google delist the web addresses of the videos in its search results and, though the legal basis for any demand would be shaky, have my attorneys ask online platforms to take down the videos altogether. But even if those websites comply, the likelihood that the videos will crop up somewhere else is extremely high. Women targeted by deepfake porn are caught in an exhausting, expensive, endless game of whack-a-troll.


The Preventing Deepfakes of Intimate Images Act won’t solve the deepfake problem; the internet is forever, and deepfake technology is only becoming more ubiquitous and its output more convincing. Yet especially because AI grows more powerful by the month, adapting the law to an emergent category of misogynistic abuse is all the more essential to protect women’s privacy and safety. As policy makers worry whether AI will destroy the world, I beg them: Let’s first stop the men who are using it to discredit and humiliate women.