Grok, Deepfakes and Where We Go From Here

By Esti DeAngelis, Managing Editor  |  February 20, 2026

On December 24, 2025, Elon Musk announced the rollout of a new X feature: Grok, the social media platform’s artificial intelligence (AI) assistant, could now digitally alter any photo or video posted on the app. Within a matter of days, a horrifying, viral trend swept the platform: users were asking the AI to digitally undress women and children whose photos were posted on the site. Soon, already damaging requests like “put her in a bikini” devolved into commands to manipulate photos in ways so sexually degrading they are unfit for print. 

Because Grok is integrated into the larger X social media app and is not a standalone AI program, all a user needed to do was tag the AI assistant underneath any post and the nonconsensual, sexualized image he had requested would be generated as a reply. What followed was a public, nearly two-week-long humiliation ritual, as innocent pictures of fully clothed women were degraded. X was slow to respond. It took until January 9 for the image-generation and editing feature to be turned off for non-paying subscribers. Days later, Grok announced it would no longer allow photos of real people to be edited in sexualized ways in places where such nonconsensual content is illegal.

It was in many ways too late. According to research conducted by the Center for Countering Digital Hate (CCDH), from December 29 to January 8 around three million sexualized images were generated on the app. While there were OnlyFans models using the feature to draw attention to their own pages, there is nothing to suggest that any significant amount of this content was generated consensually. Further, CCDH’s research estimates that more than 23,000 of the sexualized images appeared to depict children.

Amidst the Grok disaster, governments across the world took action, temporarily blocking or threatening to block the AI tool. Some launched investigations into both Grok and X more broadly. But the problem hasn’t been solved by either governments’ or X’s crackdowns. Reports suggest that the standalone Grok Imagine app, disconnected from the X platform, will still comply with requests for nonconsensual sexualized images.

Moreover, this problem is much larger than Grok; it extends to other image-generating and image-editing software as well. It is only Grok’s integration into a mainstream social media app that dragged a perverted and predatory hobby as old as the internet into the light. Because of this, the conversations surrounding Grok are similar to ones that have become increasingly common in recent years. They center on this question: What safeguards can be implemented to better protect women and children in an age when technology moves faster than the law?

This is an important discussion. It raises further questions, ones about regulation, legislation and who’s to blame when AI becomes an accomplice in what amounts to digital sexual violence. But if this is all we talk about, there will always be another version of Grok-gate. We will always be playing catch-up.

That’s because beneath these questions lurks a deeper, darker one: How did we get here? What I mean is, how did we get to a place in society where thousands of men feel perfectly comfortable publicly violating women on social media? Does their willingness to hide behind anonymous profiles mean the only thing holding them back is fear of ostracism?

There have always been bad men. Maybe this isn’t as big an anomaly as I’m making it out to be. But if my instincts are right, a large portion of these men would never think of assaulting a woman in real life. Somehow, this kind of deeply violating cybercrime has been mainstreamed, and a certain social taboo around dehumanization has been lifted. This speaks to a human problem, one that laws and regulations, though necessary, can stifle but not eliminate.

On the most basic level, it’s about consent. AI deepfakes are a digital violation, resulting either from an inability to understand why they are wrong (“It’s not really her body AI is generating”) or from a failure to value the victim enough to care (“That’s what happens when you put your picture online”). The act is self-centered and dehumanizing, and it is the first step toward the hands-on violence so many of these men say they would never dare commit.

But if digital sexual violence is so clearly one step away from physical sexual violence, what does it mean for us that so many have found themselves engaging in the former type of abuse? The mainstream feminist movement will insist that this only means we need to talk about consent more. I agree that we do, though I think that conversation should be coupled with a heavy dose of longer prison sentences. But, and this may sound radical in a culture steeped in fourth-wave feminist ideals, talking about consent is not enough.

Consent teaches an important lesson about intimacy, one that I hope is obvious. But a hyperfixation on consent alone means that all acts are value-neutral so long as everyone agrees to them. Hookup culture is celebrated. So are polyamory, degrading sexual activity and pornography. You had a consensual one-night stand but feel used anyway? Our society has no answer for that. What do we tell young men about young women when this is the world we live in?

I want every man who created a nonconsensual sexualized image of a woman or child to be penalized. Heck, we should scream the word “consent” over and over again in their ears if it will help. But I also want these men to be taught that degrading a woman in this way is wrong even if she consents to it. Because sexualized AI imagery is wrong not only as a matter of consent, but because a woman is worth more than that. A culture that cannot recognize that worth is grooming women to be harmed by men.

I will repeat this point ad nauseam because I will be accused of victim-blaming if I do not: OnlyFans models and a hypersexualized culture are no excuse for sexual violence. You stop rape not by telling women to cover up but by putting rapists in jail. But this doesn’t mean we shouldn’t do everything in our power to cultivate a society that values and protects women.

When a society breaks down to the degree that ours has, it is always the vulnerable who suffer first. Legal guardrails disappear, but so do cultural norms and standards. Boundaries are pushed into nonexistence. The unthinkable becomes thinkable. 

I want a better world for women. I want to live in a society where taking a woman’s image and abusing it becomes unthinkable again — because consent matters, but also because women’s souls matter. That is, if our society believes people even have souls to begin with.
