Issue #120 “We Need A Digital #MeToo” – The Filmmaker Fighting The Rise Of Deepfake Abuse


Deepfake image by Delfina Carmona/Kintzing © Delfina Carmona/Kintzing

“We Need A Digital #MeToo” – The Filmmaker Fighting The Rise Of Deepfake Abuse 

AI-generated deepfake content, which once quietly lurked in the dark corners of the internet, has now become commonplace. Research has found that between 90% and 95% of deepfakes are nonconsensual porn, with 90% of them depicting women. Websites dedicated to the circulation of nonconsensual deepfake pornography garner millions of views, and there is no shortage of online platforms that can remove clothing from uploaded images with one click. Yet, despite the prevalence of digital sexual violence towards women, cultural scepticism around the significance of the harm persists. 

The widespread accessibility of AI has resulted in seemingly endless ethical ramifications. In 2020, filmmaker and activist Sophie Compton witnessed these consequences when she came across pornographic deepfake forums on anonymous imageboard site 4chan. “There are spaces on the internet that reveal so many dynamics around misogyny, entitlement and how female bodies can be misused and manipulated,” says Compton. “It spoke to me on a visceral level. There was little conversation around deepfakes at the time.” Four years later, Compton’s film Another Body, which addresses deepfake abuse, is being screened globally. Yet the conversation remains disproportionate to the magnitude of the issue. 

“We need a digital #MeToo,” says Compton. Alongside her film, she created the #MyImageMyChoice campaign, which aims to bring change through amplifying survivor testimonies. “People forget that things that occur online are done by real bodies and are felt in real bodies,” she adds. 

Using language such as ‘online’ versus ‘real life’ to distinguish between types of violence has perpetuated rhetoric that minimises the gravity of digital sexual violence. This can have devastating offline consequences for survivors. University of Minnesota professor Dr Carolyn Porta explains, “This idea of real versus not real has been debunked. If someone breaks into my house, it didn’t happen to my physical body, but I felt like a victim. It didn’t have to physically touch me for me to feel scared and threatened.”

While the issue of abuse is not specific to AI, in the field of technological advancement public safety often comes second to profit. Emerging technology consultant Nina Patel believes that the tech industry’s model of innovation contributes to the digital sexual violence epidemic. “We sacrifice our rights for the sake of innovation. Within the ecosystem of technology, an acceptable way to innovate is to ‘build it and break it quickly’, because they have determined the way to innovate is to iterate. The result is products that aren’t designed with safety or rights by design,” says Patel. “This approach isn’t an optimistic way forward. If we prioritise human rights in the digital space, then we can’t afford to break it.” Evidently, the focus needs to be preventive rather than reactive.

Online platforms use AI monitoring to remove harmful content or rely on users to flag it – a mere band-aid over a hole in a sinking ship. “There’s more to be done than just report and block because that isn’t effective,” says Patel. “Women need to take their power seriously and demand better spaces. We need to hold people who are violent towards women accountable.” When the private sector fails to keep people safe, Patel argues, “governments have to step in and balance this desire for innovation and progress.”

The UK enacted the Online Safety Act in October 2023 and updated it in January this year. The act sets out how tech companies must operate and monitor their platforms, and criminalises certain behaviours such as the circulation of nonconsensual intimate pictures – deepfake or not.

Emma Pickering, head of technology-facilitated abuse and economic empowerment at domestic abuse charity Refuge UK, welcomes the legislation as a positive step forward, but she has concerns about how effective it will be. The UK Office of Communications (Ofcom) will be responsible for enforcing the legislation at the tech-company level, while the police will handle enforcement at the individual level. “Survivors of intimate image abuse are often left to navigate the situation on their own,” she says. “The risk threshold seems to be increased for physical violence, but anything online seems to be a lower priority. Police argue that they don’t have the resources to respond to every online offence and that it’s difficult to prove who the perpetrator was. But that’s not necessarily the case – with the right resources and technical ability, you can prove it.” Legislation is only as valuable as the capacity to enforce it.

In the US, there are currently no federal laws that address the sharing of nonconsensual deepfake pornography. However, following the public incident of explicit Taylor Swift deepfakes surfacing on X (formerly known as Twitter) and her fanbase rallying to get the images removed from the platform, there has been a wave of proposed legislation. “Every [US] senator started taking our calls after that Taylor Swift story,” Compton explains. “Before, if you spoke to women about this, they were afraid and angry, but it was individualised. Swifties brought a collective rage that helped shift the dial.” 

There is a cultural inclination to respond to deepfake abuse with victim blaming, suggesting that survivors should delete their social media accounts or that deepfakes are the price women pay for a career in the public eye. “Shame has historically been a weapon used against women effectively,” says Compton. “Every person that speaks out, even if it’s just to your mum or friends, starts to take the power out of [the abuse].”  

The #MyImageMyChoice movement enables survivors of digital sexual violence to share their stories, using AI to protect their identities. In March 2024, a summit on deepfake abuse was held online, involving activists, survivors, government officials and representatives from leading tech companies.

“Despite the risks that can come from speaking out, it has a profound reverberating effect on that individual and others. It has real healing power,” Compton says. #MyImageMyChoice is one of many campaigns petitioning for stricter regulations and championing the simple yet somehow revolutionary notion that women should be able to live in the fullness of themselves, both online and offline, without fear of how it could be weaponised against them.
Find out more about the #MyImageMyChoice campaign here. Another Body is available to watch on Apple TV.

Jamison Kent is a freelance arts and culture writer based in London   
