SAG-AFTRA Backs Bill Banning Nonconsensual Digital Sexualization Of Actors
SAG-AFTRA is backing a bill in the California Legislature that would allow actors to sue producers who use technology to sexualize their scenes without their permission. Senate Bill 564 also would give Californians the right to sue anyone who creates “deepfake” pornography or fake sex tapes.

“Now is the time to establish lasting protections in this industry against sexual abuse,” the union said. “Unfortunately, mainstream filmmakers have already used body doubles and/or technology to depict performers in the nude or engaging in simulated sex acts without consent. SAG-AFTRA is not going to allow free software to exacerbate this problem. Content creators need to respect union rules, irrespective of whatever new technology is available to them. If a filmmaker wants a performer to act in a sex scene, they need to hire them under a union contract and obtain meaningful consent.”

A performer working under a union contract would be able to sue in open court under this law — for example, if a producer uses technology to sexualize a scene in which the performer acted. A performer working under a union contract also would have access to collectively bargained rights, if they so choose.

SAG-AFTRA has been lobbying for a legislative fix for more than a year to address new face-swapping technologies that have been used to digitally superimpose the faces of its members onto the bodies of porn stars. That technology – known as “deepfaking” – has hijacked the likenesses of several famous actresses and singers to make it appear that they were performing in pornographic films.

The bill, introduced by State Sen. Connie Leyva, carves out several First Amendment exceptions. Under the proposed law, content creators would not be liable if they can prove that the content was created in the course of reporting unlawful activity or a legal proceeding, or was disclosed by a law enforcement officer in the course of official duties; the content related to a matter of legitimate public concern; the content was in a work of political or newsworthy value; or the content was made for purposes of commentary or criticism or is otherwise protected by the California Constitution or the U.S. Constitution.

“These exemptions reflect the free speech rights provided to creators under the First Amendment,” SAG-AFTRA said in a statement. “However, the First Amendment is not absolute, and the law has long respected other competing interests.”

Internet platforms would not be required to take down content that violates the proposed law, however. “Unfortunately, Internet platforms and search engines are under no legal obligation to remove infringing content on their sites,” the union said. “This is because Section 230 of the Communications Decency Act pre-empts state laws and immunizes platforms from the acts of their users.”

Google, Twitter and Reddit, however, already have adopted voluntary policies against nonconsensual nude performances and provide mechanisms for victims to take down content or search results.

Said SAG-AFTRA president Gabrielle Carteris: “Filmmakers have an obligation to obtain meaningful consent when producing sexually explicit material. To perform intimate scenes is a serious decision for performers; there is incredible vulnerability, with potential to affect their home life, mental health, career and public perception. Sexually explicit material must be carefully scripted and agreed upon in advance. This bill safeguards performers, ensuring that they continue working in a dignified and safe environment.”

Added David White, the union’s national executive director: “We’re entering a new digital era in which content creators use technology to manipulate images to depict individuals as engaging in sexual activity or as performing in the nude without their consent or participation. And it’s not just celebrities who are at risk. Every person is a potential target for this form of image-based sexual abuse. We need to push hard for laws that target this kind of abuse, hold bad actors accountable for their actions, and establish rules around consent and civil remedies for victims, so that bad actors are deterred from making the videos in the first place.”
