Deepfake bill would open door for victims to sue creators


Sens. Dick Durbin, Lindsey Graham and Josh Hawley plan to introduce the Disrupt Explicit Forged Images and Non-Consensual Edits Act on Tuesday.

Senate Judiciary Committee Chair Dick Durbin, D-Ill., on Capitol Hill on May 18. Francis Chung / Politico via AP file

A bipartisan group of three senators is looking to give victims of sexually explicit deepfake images a way to hold the creators and distributors of those images responsible.

Sens. Dick Durbin, D-Ill.; Lindsey Graham, R-S.C.; and Josh Hawley, R-Mo., plan to introduce the Disrupt Explicit Forged Images and Non-Consensual Edits Act on Tuesday, a day ahead of a Senate Judiciary Committee hearing on internet safety with CEOs from Meta, X, Snap and other companies. Durbin chairs the panel, while Graham is the committee’s top Republican.

Victims would be able to sue people involved in the creation and distribution of such images if the person knew or recklessly disregarded that the victim did not consent to the material. The bill would classify such material as a “digital forgery” and create a 10-year statute of limitations. 

“The volume of deepfake content available online is increasing exponentially as the technology used to create it has become more accessible to the public,” Durbin’s office said in a news release. “The laws have not kept up with the spread of this abusive content.”

In the release, the senators noted that Taylor Swift had recently become a victim of such deepfakes, which spread across Elon Musk’s X and later Instagram and Facebook.

“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit deepfakes is very real,” Durbin said. “Victims have lost their jobs, and may suffer ongoing depression or anxiety.”

Nonconsensual sexually explicit deepfakes use AI technology to create original fake imagery, “undress” real photos, and “face-swap” people into pornographic videos. Victims are overwhelmingly women and girls. 

“Nobody, neither celebrities nor ordinary Americans, should ever have to find themselves featured in AI pornography,” Hawley said. “Innocent people have a right to defend their reputations and hold perpetrators accountable in court. This bill will make that a reality.”

Washington has taken notice, though no major legislation has been passed. In May 2023, Rep. Joe Morelle, D-N.Y., introduced the Preventing Deepfakes of Intimate Images Act, which would criminalize the nonconsensual production and sharing of AI-generated sexually explicit material.

