Explicit deepfakes are traumatic. How to deal with the pain.


Therapist Francesca Rossi works with clients who’ve had real images of themselves turned into sexually explicit content without their consent. All of her current clients identify as women.

This type of image-based abuse, known as an explicit deepfake generated by artificial intelligence, is frequently perpetrated by a current or former intimate partner, or a known friend, coworker, or neighbor, as a form of harassment and stalking. Rossi, a licensed clinical social worker in New York, has seen her clients heal from this betrayal, but the journey is long, and rarely predictable.

In some states, creating and distributing an explicit deepfake might be against the law, but even so, local law enforcement may have few resources to investigate such cases.

The victim typically has to marshal her own response. Her options include attempting to track down the imagery and issuing takedown notices wherever it appears, but there’s no guarantee she’ll locate all of it. Rossi says explicit deepfakes are often traded between individuals and downloaded without the victim’s knowledge.


Feeling successful one day doesn’t mean the next day will be the same. The imagery may pop up on new platforms. The perpetrator may send it to the survivor’s friends, family, and employer. Rossi says survivors naturally become hypervigilant. They often, impossibly, want to avoid the internet altogether. Sometimes they become fixated on monitoring imagery of themselves online, using the internet excessively to do so.

“Being victimized through deepfakes can erase your sense of reality,” says Rossi, noting the dissonance survivors feel because the fake imagery looks real and convincing. “They distort your understanding of the world and everything you know to be true.”

Why safety planning is critical for healing

Rossi says that people need to feel safe in order to restore their sense of reality. Creating that safety happens through measures big and small.

In the beginning, when the deepfakes are discovered, Rossi says that it’s important to gather trusted loved ones who can offer emotional support, help locate where the deepfakes appear, and try to remove them, or develop a strategy for navigating this complex process, possibly in partnership with law enforcement or attorneys.

The U.S.-based Cyber Civil Rights Initiative has an image abuse helpline, along with a thorough guide for what to do once you’ve become a victim. In the United Kingdom, people can turn to the Revenge Porn Helpline, which aids survivors of intimate image abuse.

In addition, people may want to remove their personal information from databases maintained by data brokers, which can be done through paid services or by contacting the brokers directly. That data, including a person’s home address and the names of their family members, can be used for doxxing, harassment, and stalking.

Kate Keisel prioritizes physical and psychological safety planning in her work as cofounder of the New Jersey-headquartered Sanar Institute, which provides trauma-specific mental health services to survivors of interpersonal violence, including image-based sexual abuse.

Keisel says that survivors are often told by well-meaning supporters to stay off the internet when that’s simply not an option for personal and professional reasons. That’s why physical safety planning can include an understanding that even after initial successful takedown notices, there’s no guarantee the imagery won’t surface again.

Instead of hoping that the abuse will definitively end, Keisel recommends that survivors implement boundaries related to how they spend their time online, particularly if they find it distressing not to look for images. To increase their psychological safety, survivors may want to set a limit on the number of hours they devote to searching for images of themselves.

Keisel says it can be helpful for survivors to identify and stick to tasks that feel squarely within their control, like removing their personal information from the internet.

Getting and staying grounded

While practical steps, like issuing takedown notices, are key to safety planning, both Keisel and Rossi say survivors also benefit from grounding and mindfulness practices that decrease psychological distress and anxiety.

Survivors suffer particularly because their nervous system perceives a constant threat; deepfakes, after all, have the potential to re-emerge, or may still exist online or in someone else’s possession.

A therapist can teach a survivor new techniques, but Keisel says activities that bring someone into the present moment, so they can fully inhabit their body, can also be powerfully calming. These can include trauma-sensitive yoga and Tai Chi.

Rossi also recommends calming strategies that stimulate the senses, such as lighting incense or a candle, and laughing, which can reduce the body’s response to fear.

“We can’t think our way out of trauma,” says Keisel. That’s why she believes “somatic,” or body-based, practices help a survivor feel safe in the present moment, even if their life has been turned upside down.

Keisel says that there will be moments when a survivor’s nervous system goes into panic mode because of a new development, but that it’s possible to learn skills to better tolerate that distress.

The combination of safety planning, gaining more control, and self-soothing can put someone on the path to healing, Keisel says.

There is hope

Rossi and Keisel are among several therapists and professionals in the U.S. who specialize in treating survivors of image-based sexual abuse, but their expertise is uncommon. Rossi says she has more consultation requests than she can handle; they’ve increased markedly since AI software and apps capable of producing explicit deepfakes became more widespread late last year.

The abuse is accelerating at a pace that lawmakers and tech companies aren’t matching, though the White House recently issued a call to action for digital platforms and services to tackle the problem.

The White House’s recommendations included Congressional action to strengthen legal protections for survivors of image-based sexual abuse and provide them with critical resources.

Keisel says that those who want to talk to a therapist should consider interviewing them about their treatment practices to see if they’re the right fit. Survivors might avoid therapists who don’t understand image-based sexual abuse, or who aren’t trained in a trauma-sensitive approach.

But Keisel doesn’t want survivors to give up on the idea of healing, even if it sometimes feels unimaginable.

“There’s this idea that those of us who’ve experienced this level of trauma are going to be stuck in place where we can’t move forward,” Keisel says. “When we have the right support in place, we move past these things in life.”

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.
