
Elon Musk’s AI Is Generating Sexual Images Of Women And Girls. Here’s What To Do If It Happens To You.

By Spluk.ph
January 11, 2026, in Science & Technology


Over the past few weeks, people on X, the Elon Musk-owned social media platform, have used the app's chatbot, Grok, to generate sexual images of women and girls without their consent.

With a few simple instructions ("put her into a very transparent mini-bikini," for example), Grok will digitally strip anyone down to a bikini.

A report by the nonprofit AI Forensics found that 2% of 20,000 randomly selected images generated by Grok over the holidays depicted a person who appeared to be 18 or younger, including 30 young or very young women or girls in bikinis or transparent clothing. Other images depict women and girls with black eyes, covered in liquid, and looking afraid.

Despite receiving international backlash and regulatory probes in Europe, India and Malaysia, Musk at first mocked the situation by sharing an array of Grok-generated images, including one depicting himself in a bikini, alongside laughing-crying emojis.

By Jan. 3, Musk had commented on a separate post: "Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content." (We'll explain what constitutes illegal content later on.)

"What matters legally and morally is that a real person's body and identity were used without consent to create a sexualized lie."

– Rebecca A. Delfino, an associate professor of law who teaches generative AI and legal practice at Loyola Marymount University

Deepfake nudes are nothing new, but experts say it's getting easier to create and publish them.

Deepfake nudes are nothing new. For years, apps like "DeepNude" have given people access to deepfake technology that allows them to digitally insert women into porn or strip them naked without their knowledge. (Of course, men have been victims of sexualized deepfakes as well, but research indicates that men are more likely than women to perpetrate image-based abuse.)

Still, Grok's usage this week is different and arguably more alarming, said Carrie Goldberg, a victims' rights attorney in New York City.

"The Grok story is unique because it's the first time there's a combining of the deepfake technology, Grok, with an immediate publishing platform, X," she said. "The rapid publishing capability enables the deepfakes to spread at scale."

"It must be underscored how bizarre it is that the world's richest man not only owns the companies that create and publish deepfakes, but he's also actively promoting and goading users on X to de-clothe innocent people," Goldberg added. "Elon Musk feels entitled to strip people of their power, dignity, and clothes."

What's been happening these past couple of weeks is unfortunate, but none of it is a surprise to Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered AI. Her take: This problem will get worse before it gets better.

"Every tech service that allows user-generated content will inevitably be misused to upload, store and share CSAM (child sex abuse material), as CSAM bad actors are very persistent," she said.

VINCENT FEURAY via Getty Images

The upshot is that AI companies must learn how best to implement robust safeguards against illegal imagery. Some companies may have a stronger culture of "CSAM/nonconsensual deepfake porn isn't OK."

Others will try to have it both ways, setting up loose guardrails for safety while also attempting to make money from permissible NSFW imagery, Pfefferkorn said.

"Unfortunately, while I don't have any direct insight, x.AI does not seem to have that strong of a corporate culture in that respect, going off Elon Musk's dismissive reaction to the current scandal as well as previous reporting from a few months ago," she said.

Victims of this kind of exploitation often feel powerless and unsure of what they can do to stop the images from proliferating. Women who are vocal online worry about the same thing happening to them.

Omny Miranda Martone, the founder of the Washington-based Sexual Violence Prevention Association, had deepfake nude videos and photos of themselves posted online several years ago. As an advocate for legislation preventing digital sexual violence, Martone wasn't exactly surprised to be a target.

"They also sent the deepfakes to my organization, in an attempt to silence me. I've seen this same tactic used on Twitter with Grok over the last week," they said.

Martone said they've seen several instances of a woman sharing her opinion and men who disagree with her using Grok to create explicit images of her.

"In some cases, they're using these images to threaten the women with in-person sexual violence," they added.

Roc Canals via Getty Images

One of the most persistent beliefs about deepfakes depicting nudity is that because an image is "fake," the harm is somehow less real. That assumption is wrong, said Rebecca A. Delfino, an associate professor of law who teaches generative AI and legal practice at Loyola Marymount University.

"These images can cause serious and lasting damage to a person's reputation, safety, and psychological well-being," she said. "What matters legally and morally is that a real person's body and identity were used without consent to create a sexualized lie."

While protections remain uneven, untested and often come too late for victims, Delfino said the law is slowly beginning to acknowledge that reality.

"Stories like what's happening with Grok matter because public attention often drives the legal and regulatory responses that victims currently lack," she said. "The law is finally starting to treat AI-generated nude images the same way it treats other forms of nonconsensual sexual exploitation."

What can be done if an AI-generated nude is posted of you?

Preserve the evidence.

If you identify deepfake content of yourself, screen-capture it and report it immediately.

"The most practical advice is to act quickly and methodically," Delfino said. "Preserve evidence (screenshots, URLs, timestamps) before content is altered or removed. Report the image to platforms clearly as nonconsensual sexual content and continue to follow up."

If you're under 18 in a nude or nudified image, platforms should take that very seriously, Pfefferkorn said. Sexually explicit imagery of children under 18 is illegal to create or share, and platforms are required to promptly remove such imagery when they learn of it and report it to the National Center for Missing & Exploited Children (NCMEC).

"Don't be afraid to report a nude image to NCMEC that you took of yourself when you were underage: there is also a federal law saying you can't be legally punished if you report it," Pfefferkorn added.

And if a minor is involved, law enforcement should be contacted immediately.

"When possible, consulting with a lawyer early can help victims navigate both takedown efforts and potential civil remedies, even where the law is still evolving," Delfino said.

Fiordaliso via Getty Images

Know that there's growing legal recourse.

The Take It Down Act, signed into law last May, is the first federal law that limits the use of AI in ways that can harm individuals. (Ironically enough, Grok gave someone insight about the Take It Down Act when asked about the legal consequences of digitally undressing someone.)

This legislation did two things, Martone said. First, it made it a criminal offense to knowingly publish AI-generated explicit videos and images without the consent of the person depicted. Second, it required social media sites, search engines, and other digital platforms to create "report and remove procedures" by May of 2026, still several months away.

"In other words, all digital platforms must have a way for users to report that someone has posted an explicit video or image of them, whether it was AI-generated or not," they said. "The platform must remove reported images within 48 hours. If they fail to do so, they face penalties from the Federal Trade Commission (FTC)."

Pfefferkorn noted that the law allows the Department of Justice to prosecute only those who publish or threaten to publish NCII (nonconsensual intimate images) of victims; it doesn't allow victims to sue.

As it's written, the Take It Down Act only covers explicit images and videos, which must include "the uncovered genitals, pubic area, anus, or post-pubescent female nipple of an identifiable individual; or the display or transfer of bodily sexual fluids."

"A lot of the images Grok is creating right now are suggestive, and certainly harmful, but not explicit," Martone said. "Thus, the case could not be pursued in criminal court, nor would it be covered by the new report-and-remove procedure that will be created in May."

There are also many state laws, which the nonprofit consumer advocacy group Public Citizen tracks here.

Nico De Pasquale Photography via Getty Images

Remember that you're not alone.

If this has happened to you, know it's not your fault and you aren't alone, Martone said.

"I recommend immediately contacting a loved one. Ask them to come over or talk with you on the phone as you go through the process of finding the images and choosing how to take action," they said.

Once you have a loved one helping you, reach out to your local rape crisis center, a victims' rights attorney in your state, or an advocacy group to help you identify your options and navigate these processes safely, Martone said.

"Because there are so many variations in state laws, a local expert will ensure you are receiving guidance that is accurate and applicable to your situation," they said.

Need assistance? Go to RAINN’s National Sexual Assault Online Hotline or the National Sexual Violence Resource Center’s website.



Source link

© 2025 Spluk.ph | All Rights Reserved