Police investigate AI-generated nude photos of students in Beverly Hills


Reports from a middle school allege that students were using AI tools to create and share fake nude photos of their classmates.

Beverly Hills, California, police are investigating reports that students made fake nude photos of their classmates at a middle school, a city official said Wednesday. 

Deputy City Manager Keith Sterling said the department is investigating students at Beverly Vista Middle School who authorities say used artificial intelligence tools to create the images and share them with other students.

School officials were made aware of the “AI-generated nude photos” last week, Sterling said in a letter to parents.

Students and parents told NBC News they were afraid to go to school or send their children to school after the incident, which follows a string of similar AI-generated nude photo cases at schools around the world. The emergence of sophisticated, accessible apps that “undress” or “nudify” photos, along with “face-swap” tools that superimpose victims’ faces onto pornographic content, has led to an explosion of nonconsensual sexually explicit deepfakes that predominantly target women and girls.

Security guards at Beverly Vista Middle School in Beverly Hills, Calif., on Monday. (Jason Armond / Los Angeles Times via Getty Images)

Mary Anne Franks, president of the Cyber Civil Rights Initiative and a professor at George Washington University Law School, told NBC News that AI-generated nude photos of students could be illegal depending on the facts of a case and what the images depict.

For example, Franks said, a criminal case could involve sexual harassment, or the material could be considered child sexual abuse material (CSAM, a term experts and advocates favor over "child pornography"). Not all nude photos of children, AI-generated or not, fall under the legal definition of CSAM — but some do, including some AI-generated depictions. For depictions to be illegal, they must show sexually explicit conduct, which is a higher bar than nudity alone.

“We do have federal and other prohibitions against certain depictions of actual children’s faces or other parts of their bodies that are mixed in with other things,” Franks said. “Depending on the factual circumstances, there could be behavior that rises to the level of harassment or stalking.”
