Lawyer Sean Smith has seen up close how nonconsensual deepfakes, a form of image-based sexual abuse, can ruin lives.
Smith, a family law attorney with the Roseland, New Jersey, firm Brach Eichler, has recently represented the families of both minor victims and perpetrators in school disciplinary proceedings.
His clients have included teen girls whose images were taken from social media, then digitally “undressed” by their male classmates, who used software powered by artificial intelligence.
The apps and websites capable of creating explicit nonconsensual deepfakes typically market themselves as satisfying a curiosity or providing entertainment. As a result, users likely don’t understand that the resulting imagery can inflict painful, lifelong trauma on the person whose likeness has been stolen — who is almost always a girl or woman. The victim may never be able to remove every synthetic photo or video from the internet, given how difficult it is to track and delete such content.
This can lead to professional, personal, and financial devastation for survivors. The same can be true for perpetrators once their name and reputation are associated with creating nonconsensual deepfakes. Student offenders may face suspension or expulsion, and, depending on where they live, perpetrators may also face criminal and civil penalties.
“It destroys lives on every side,” Smith told Mashable.
This typically isn’t made clear to youth and adult users who engage in image-based sexual abuse.
Is it illegal to make a deepfake?
Though many users remain unaware of these consequences, the rise of nonconsensual deepfakes has prompted several states to pass legislation criminalizing them.
Meanwhile, Congress has introduced but has yet to vote on a bill that would give victims the right to file a civil suit against perpetrators. A separate federal bill would criminalize the publication of nonconsensual intimate imagery, including that created by AI, and require social media companies to remove that content at a victim’s request.
In some states, offenders can face civil penalties should the victim successfully sue them for damages. Their wages may be garnished or their property seized to pay for such damages.
Last year, Illinois amended an existing law to make deepfake offenders liable when they distribute nonconsensual synthetic images. A survivor can sue the person who disseminates the content for damages stemming from emotional distress, the cost of mental health treatment, the loss of a job, and other related harms.
“When the laws get enforced, it’s going to be a black mark that will follow a person for a very long time…”
In New York, dissemination of nonconsensual deepfakes can lead to a year spent in jail, a fine, and a civil suit. Florida imposes both criminal and civil penalties for the “promotion” of nonconsensual synthetic material. The state’s law also expanded the definition of “child pornography” to include deepfakes of minors engaged in sexual conduct.
Indiana, Texas, and Virginia are among the states that have made the creation of nonconsensual deepfakes punishable by jail time.
Many states, however, don't yet have laws that make the creation or distribution of deepfakes illegal, or that give victims the right to sue. Additionally, it may be difficult for victims to pursue criminal or civil penalties when the identity of the person who promoted the content is unknown, or when law enforcement lacks the staffing to investigate potential crimes.
But Matthew B. Kugler, professor of law at Northwestern University, says that shouldn’t give people a false sense of security.
“When the laws get enforced, it’s going to be a black mark that will follow a person for a very long time, and no one’s going to feel bad about the fact that that black mark follows [the offender] for a very long time,” Kugler says.
In 2020, Kugler studied public attitudes toward sexually explicit, nonconsensual deepfake videos in a survey of 1,141 U.S. adults. The vast majority of the respondents wanted to criminalize the act.
There is another potential legal consequence to creating nonconsensual deepfake imagery, regardless of whether the offender’s state imposes criminal or civil penalties.
Adam Dodge, a lawyer and founder of Ending Tech-Enabled Abuse (EndTAB), says that a victim can file for a protective or restraining order if she knows who’s responsible for the creation or distribution of the imagery. In many jurisdictions, image-based abuse qualifies as a form of harassment.
Such restraining orders are discoverable in background searches conducted by potential employers, Dodge says. A restraining order can also be applied to a youth offender. Though a minor’s legal record is meant to be sealed, Dodge has seen instances where the information becomes public.
What happens to minors who create or share a nonconsensual deepfake
Teens who find deepfake apps or sites, either through word of mouth or ruthless internet marketing and search strategies, often don’t grasp the potential fallout for victims or themselves, says Smith.
He notes that because the phenomenon is so new, school-based discipline can vary widely. At public schools, which are legally obligated to keep students enrolled to the extent possible, punishment may be limited to brief in-school or out-of-school suspensions.
But Smith says that private schools, with their own codes of conduct, may quickly escalate to expulsion.
The victim’s parents may also pursue legal action in an effort to hold the perpetrator and their family accountable. Though Smith hasn’t seen such a case yet, he expects some parents to begin filing civil lawsuits against a perpetrator’s parents on the grounds of negligent supervision. Any damages won could potentially be covered by homeowner’s insurance, unless the parents’ carrier restricts such claims.
Teens could also be subject to criminal penalties, including those related to child pornography and other criminal statutes. Smith is aware of juvenile proceedings against teens who've created nonconsensual deepfakes. Though they did not serve time in jail, the offenders entered into a private agreement with the state acknowledging culpability for their actions.
In Florida, however, two teens were arrested and charged with felonies last December for disseminating nonconsensual deepfakes.
Smith says that parents and teens urgently need to understand these and other consequences.
“The problem with this technology is that the parents and the kids don’t realize how big a mistake the use of the technology is,” Smith says. “How just the introduction of the technology onto a cellphone…can create this much larger lifetime mistake.”
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.
Source: The consequences of making a nonconsensual deepfake