Technology

Teen sues AI tool maker over fake nude images

A teenager in New Jersey has filed a major lawsuit against the company behind an artificial intelligence (AI) “clothes removal” tool that allegedly created a fake nude image of her. 

The case has drawn national attention because it shows how AI can invade privacy in harmful ways. The lawsuit was filed to protect students and teens who share photos online and to show how easily AI tools can exploit their images.

Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join my CYBERGUY.COM newsletter.

How the fake nude images were created and shared

When she was 14, the plaintiff posted a few photos of herself on social media. A male classmate used an AI tool called ClothOff to remove her clothing in one of those pictures. The altered photo kept her face, making it look real.

The fake image quickly spread through group chats and social media. Now 17, she is suing AI/Robotics Venture Strategy 3 Ltd., the company that operates ClothOff. A Yale Law School professor, several students and a trial attorney filed the case on her behalf.

A New Jersey teen is suing the creators of an AI tool that made a fake nude image of her. (iStock)

The suit asks the court to delete all fake images and stop the company from using them to train AI models. It also seeks to remove the tool from the internet and provide financial compensation for emotional harm and loss of privacy.

The legal fight against deepfake abuse

States across the U.S. are responding to the rise of AI-generated sexual content. More than 45 states have passed or proposed laws to make deepfakes without consent a crime. In New Jersey, creating or sharing deceptive AI media can lead to prison time and fines.

At the federal level, the Take It Down Act requires companies to remove nonconsensual images within 48 hours after a valid request. Despite new laws, prosecutors still face challenges when developers live overseas or operate through hidden platforms.

The lawsuit aims to stop the spread of deepfake “clothes-removal” apps and protect victims’ privacy. (iStock)

Why legal experts say this case could set a national precedent

Experts believe this case could reshape how courts view AI liability. Judges must decide whether AI developers are responsible when people misuse their tools. They also need to consider whether the software itself can be an instrument of harm.

The lawsuit highlights another question: How can victims prove damage when no physical act occurred, but the harm feels real? The outcome may define how future deepfake victims seek justice.

Is ClothOff still available?

Reports indicate that ClothOff may no longer be accessible in some countries, such as the United Kingdom, where it was blocked after public backlash. However, users in other regions, including the U.S., still appear able to reach the company’s web platform, which continues to advertise tools that “remove clothes from photos.”

On its official website, the company includes a short disclaimer addressing the ethics of its technology. It states, “Is it ethical to use AI generators to create images? Using AI to create ‘deepnude’ style images raises ethical considerations. We encourage users to approach this with an understanding of responsibility and respect for others’ privacy, ensuring that the use of undress app is done with full awareness of ethical implications.”

Whether fully operational or partly restricted, ClothOff’s ongoing presence online continues to raise serious legal and moral questions about how far AI developers should go in allowing such image-manipulation tools to exist.

This case could set a national precedent for holding AI companies accountable for misuse of their tools. (Kurt “CyberGuy” Knutsson)

Why this AI lawsuit matters for everyone online

The ability to make fake nude images from a simple photo threatens anyone with an online presence. Teens face special risks because AI tools are easy to use and share. The lawsuit draws attention to the emotional harm and humiliation caused by such images.

Parents and educators worry about how quickly this technology spreads through schools. Lawmakers are under pressure to modernize privacy laws. Companies that host or enable these tools must now consider stronger safeguards and faster takedown systems.

What this means for you

If you become a target of an AI-generated image, act quickly. Save screenshots, links and dates before the content disappears. Request immediate removal from websites that host the image. Seek legal help to understand your rights under state and federal law.

Parents should discuss digital safety openly. Even innocent photos can be misused. Knowing how AI works helps teens stay alert and make safer online choices. You can also demand stricter AI rules that prioritize consent and accountability.

Take my quiz: How safe is your online security?

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz here: Cyberguy.com.

Kurt’s key takeaways

This lawsuit is not only about one teenager. It represents a turning point in how courts handle digital abuse. The case challenges the idea that AI tools are neutral and asks whether their creators share responsibility for harm. We must decide how to balance innovation with human rights. The court’s ruling could influence how future AI laws evolve and how victims seek justice.

If an AI tool creates an image that destroys someone’s reputation, should the company that made it face the same punishment as the person who shared it? Let us know by writing to us at Cyberguy.com.

Copyright 2025 CyberGuy.com.  All rights reserved.

You need to listen to the brutally oppressive I’ve Seen All I Need to See

There are only a handful of albums that I think qualify as genuinely scary. You Won’t Get What You Want by Daughters and To Be Kind by Swans both immediately come to mind. But those records come with… let’s say, baggage. I’ve Seen All I Need to See lacks some of the atmospheric spookiness of To Be Kind and the flashes of pop-tinged menace of You Won’t Get What You Want, but it makes up for that with unrelenting brutality. It’s not the soundtrack to a slasher film; it’s the most violent scene in the bleakest horror film, rendered as blown-out drums and detuned guitar.

The album opens with a reading of Douglas Dunn’s The Kaleidoscope, a poem about being trapped in a cycle of grief, as sparse drums boom arhythmically alongside bursts of noise and a low metallic drone. As it transitions into the distant shriek of vocalist / guitarist Chip King, “A Lament” sputters in fits and starts as it struggles to take flight.

That sets the tone for the record, which is less a collection of songs and more a relentless monolith erected in tribute to the power of distortion. And this is where I admit that I’ve Seen All I Need to See won’t be for everyone. It’s largely atonal, tracks can blend into each other, and even when the drums pick the pace up beyond funeral dirge, the songs feel weighed down, like the band is trying to play their way out of a bog.

That’s not to say there aren’t moments of catharsis to be found. “The City is Shelled,” in particular, erupts toward its back end as King’s vocals become a Goblin-esque croak over pounding piano chords, delivering one of the few moments of genuine melodicism (even if it’s buried under a skyscraper of fuzz).

Even though it’s only 38 minutes long, at times, I’ve Seen All I Need to See can feel like an endurance exercise. But, like a marathon, that doesn’t mean it’s not worth enduring. There is beauty in its brutality. It’s haunting and vicious in the way that, say, Bring Her Back is. Good art is not necessarily pleasant art.

If you’re looking for a record that conjures horror-movie vibes without devolving into camp, something that feels genuinely dangerous and frightening rather than merely kind of spooky, The Body’s I’ve Seen All I Need to See is what you’re looking for. The record is available on Bandcamp and most streaming services, including Apple Music, Tidal, Deezer, YouTube Music, and Spotify.

Apple says Jon Prosser ‘has not indicated’ when he may respond to lawsuit

Earlier this week, Jon Prosser, who is being sued by Apple for allegedly stealing trade secrets, told The Verge that he has been “in active communications with Apple since the beginning stages of this case.” But Apple, in a new filing on Thursday that was reported on by MacRumors, said that while Prosser has “publicly acknowledged” Apple’s complaint, he “has not indicated whether he will file a response to it or, if so, by when.”

Prosser didn’t immediately reply to a request for comment from The Verge. Apple sued Prosser, who posted videos earlier this year showing off features that would debut in iOS 26 ahead of their official announcement, and another defendant, Michael Ramacciotti, in July. The company alleged that Prosser and Ramacciotti had “a coordinated scheme to break into an Apple development iPhone, steal Apple’s trade secrets, and profit from the theft.”

A clerk already entered a default against Prosser last week, which means he hasn’t responded to the lawsuit and that the case can move forward. In Thursday’s filing, Apple said it “intends to file a default judgment seeking damages and an injunction against him.”

Thursday’s filing also includes statements from Ramacciotti. While Ramacciotti “admits to” providing information about iOS 26 to Prosser, “no underlying plan, conspiracy, or scheme was formed” between them, Ramacciotti said. He also claimed that he “had no intent to monetize this information when he contacted Mr. Prosser, nor was there any arrangement at the time the information was conveyed that he would be compensation [sic].”

Apple and Ramacciotti have also “informally discussed settlement,” according to the filing.

Scientists spot skyscraper-sized asteroid racing through solar system

Astronomers have reportedly discovered a skyscraper-sized asteroid moving through our solar system at a near record-breaking pace.

The asteroid, named 2025 SC79, circles the sun once every 128 days, making it the second-fastest known asteroid orbiting in the solar system.

It was first observed by Carnegie Science astronomer Scott S. Sheppard on Sept. 27, according to a statement from Carnegie Science.

A skyscraper-size asteroid, named 2025 SC79, was discovered in September, hidden in the sun’s glare. (Carnegie Science)

The asteroid is the second known object whose orbit lies entirely inside that of Venus, the statement said. It crosses Mercury’s orbit during its 128-day trip around the sun.

“Many of the solar system’s asteroids inhabit one of two belts of space rocks, but perturbations can send objects careening into closer orbits where they can be more challenging to spot,” Sheppard said. “Understanding how they arrived at these locations can help us protect our planet and also help us learn more about solar system history.”

The celestial body is now traveling behind the sun and will be invisible to telescopes for several months.

Sheppard’s search for so-called “twilight” asteroids helps identify objects that could pose a risk of crashing into Earth, the statement said.

The work, which is partially funded by NASA, uses the Dark Energy Camera on the National Science Foundation’s Blanco 4-meter telescope to look for “planet killer” asteroids in the glare of the sun that could pose a danger to Earth.

The NSF’s Gemini telescope and Carnegie Science’s Magellan telescopes were used to confirm the sighting of 2025 SC79, Carnegie Science said. 

The fastest known asteroid was also discovered by Sheppard and his colleagues, in 2021. Sheppard studies solar system objects including moons, dwarf planets and asteroids.

That one takes 113 days to orbit the sun.
