Written by Miracle Okah
Imagine posting a beautiful picture of yourself on X (formerly Twitter), feeling confident. Then, a stranger gets hold of it and asks Grok, X’s chatbot, to turn your picture around so they can see what your body might look like from behind.
But they don’t stop there; they go further by asking the bot to undress you publicly. Grok, thankfully, has some restrictions: it will create an AI-generated image of your back but refuse to produce a nude one. So they turn to another AI tool, one without boundaries, and it does exactly what they want. Then they return with a fake naked version of you and post it online. The image spreads; people share it, comment, laugh, and sexualise you. All of a sudden, you go viral for something you never consented to. Yup, that is where we are now.
The scary thing about all this is that most people can’t tell it is fake unless someone points it out.
Only you know you didn’t post an explicit picture; nobody else can tell or vouch for you.
If this sounds like a dystopian tale too terrible to be true, think again.
On the 2nd of June, 2025, a woman posted a photo of herself online. Shortly after, men like @iamelycruiz1 began by asking Grok to remove her seatbelt. Then others joined in and demanded that Grok undress her. The chatbot refused, and after many failed attempts, @itslilberry shared a naked AI-altered image of the woman. In less than 24 hours, it went viral. She was not the only victim; she was just one among many.
Before AI, before Grok, before deepfakes and all sorts of prompts, women’s bodies were already seen and treated like public property. Something a man can just pick and own. A man could catcall you on the road, not because he knows you but because he believes your body was made for his viewing pleasure.
It is the same logic that allows a teacher to joke about your chest in front of the class, that makes your boss stand too close, that gives a stranger the guts to flash you in public, that makes a conductor slap your butt and call it an accident, while people tell you that you are making a big deal out of nothing.
Offline harassment has always been part of women’s stories. Now it has moved online, upgraded with new technological tools but driven by the same mindset.
Before, just one man could harass you on the street; now more than a hundred can do it anonymously from their phones, across different time zones. You might not even know it is happening until someone tags you in it. The same audacity and entitlement that makes them whistle at you in traffic now pushes them to strip digital clothes off your image.
If we are being honest, we all know that women have never been safe. Not in schools, on buses, at work, or even in our homes. Right from childhood, we have been taught to never let a man touch our private parts, to walk with our keys between our fingers, to scan the passengers before entering a cab, and to text someone before and after we leave home.
We have built our lives around caution, trying not to be visible enough to be targeted. It has been exhausting, and for a while, social media felt like our safe space from all that. It was a place where we could take a break and finally breathe, somewhere we could control our image however we wanted, without constantly looking over our shoulders. Now we come online and experience the same thing we have been running away from.
I spoke to several women about how this new AI-enabled abuse made them feel and their outrage was expected.
“I am very ANGRY! I feel violated, and it is not even my pictures that they are doing things to,” said Jennifer Ayogu, a product designer.
“I am even more annoyed that there are tools that exist for such ridiculous reasons and these social media platforms have been doing nothing to stop these perverts.”
She continued by saying she was afraid and really disappointed. “As a woman, it is already bad enough that we have to deal with perverts in real life. Now we have to be wary of them online too? I feel helpless and defeated because there’s really nothing I can do. Do we now stop posting pictures online?” she asked rhetorically.
Jennifer concluded with a plea, “I do not feel safe online again. I just hope all the parties involved will do something to stop these people as soon as possible because it’s getting out of hand.”
Famakinwa Faith, a nurse at UCH in Ibadan, described the trend as “terrifying”.
“There’s something deeply invasive about the idea that a picture shared online could be twisted into something horrible. It feels like a digital violation.”
What unsettled her the most, she explained, was how believable the altered images were. “The AI-generated images are so realistic. Denying it and trying to prove your innocence as the victim is not fair and it is such an exhausting position to be in.”
Faith admitted that nothing about Twitter men surprised her anymore. “At this point, there’s almost nothing Twitter men could do that would shock me. They’ve shown over and over again just how shameless and repulsive they can be. But what did shake me was the realisation that some random stranger could take a photo of me, generate a fake naked version and that image might end up getting more engagement than the real one.”
She continued, “Now, should I start second-guessing before posting online? I wonder if I’m putting myself at risk just by being visible. And it is not just about me. Should I be concerned about my little sister posting her pictures online too? What if someone does the same to her? It is like the internet is no longer a place to express yourself. It has become a place where you are constantly exposed, even when you’re fully clothed.”
She said she stopped feeling safe online a long time ago, which explains why she rarely posts on any of her social media accounts. “The internet was supposed to be this space for connection and expression but it feels more like a hunting ground for so many women. If you are all for body positivity and want to show some skin, they’ll come for you. If you’re fully clad and minding your business, they’d still come for you. We are expected to accept a level of exposure and risk that men rarely have to face. And until that changes, feeling safe online will continue to be a privilege and not a normal thing.”
Atinuke, a freelancer, described the experience as “all shades of wrong”. “Just knowing that someone could use AI to create a fake, naked version of me is something I can’t even begin to explain how violating and scary the thought of it makes me feel. It feels like being stripped bare.”
Atinuke said she is blocking off toxicity and muting users who engage in such behaviour.
“I am curating my timeline to be with people I believe to be reasonable while hoping they don’t find a way to get to me someday. It’s hard, though; every day I see women going through a form of violation, with many people saying it’s not deep.”
I asked her if she has ever felt safe online and she sadly asked, “Lol, are we ever safe as women?”
Busayo, a young woman who works in the marketing department at UBA, said she feels too scared to post pictures online. “It shows the world is becoming scarier and it makes me not want to upload my pictures because if I do and someone does that to me, I will feel dehumanised.”
She mentioned that even the thought alone is enough to make her retreat from the internet. “It is becoming harder to deal with. I just don’t want to be seen anymore.”
These incidents didn’t just start today; they go back to 2019, when someone built an app called DeepNude. With just one click, it could strip the clothes off any woman in a photo and make it look real. It was so convincing that someone could take your face, put it on a naked body, and pass the image around as if it belonged to them. What I find bizarre is that the app only worked on women; if you uploaded a man’s picture, it would just change his genitals.
Even though the creator shut it down after much outrage, the code was already public, and anyone could copy, tweak, and share it. From there, it got worse. Next came deepfake porn, which started by targeting celebrities and then moved on to regular women who had just posted a cute selfie online.
People will say, “Don’t post if you don’t want attention,” but we all know that harassment has never had anything to do with what we post and everything to do with the men who feel entitled to commit atrocities.
Now how can we tackle this? How do we protect ourselves online when the violations no longer need physical contact?
I asked each woman what steps they would like to see.
Jennifer said, “Social media accounts of these perverts should be banned and they should be arrested immediately. The tech companies should be sanctioned for even building such tools or including those features and they should be forced to disable them. There’s so much AI can do; why are these companies not even training them to flag down these accounts or content posted by these perverts instead of creating them to cause harm to others?”
Atinuke agreed. “This shouldn’t even be something an AI should be permitted to do; it’s a serious infringement of our rights as women. There should be a way to program these AIs to say NO to such prompts.”
Busayo added, “There should be a law for this. Something like: ‘Anybody who uses AI in this kind of way will be arrested and will be given a harsh judgment.’ We need consequences.”
Faith stressed that tech companies need to stop pretending that they’re neutral platforms. “They’re not. They shouldn’t be granting useless requests like that. They have the power to detect, trace, and take down AI-generated abuse, and they should be held accountable when they don’t.”
She continued, “Also, the laws need to catch up. AI is growing fast but the legal system right now moves at a snail’s pace compared to AI development. Harassment with the use of AI needs to be criminalised, not just as defamation or image misuse but as a form of gender-based violence. It is the men doing this and as usual, their target is always innocent women who choose to make use of the internet too. And as a society, we have to stop treating digital abuse like it’s less real than physical harassment. The trauma is real and so is the damage.”
Just like Faith said, when people dismiss it as “just AI” or say, “It is not real,” they ignore the consequences. The humiliation, trauma, and loss of safety these women feel are real.
As women, we must stand together and fight this collectively by calling out offenders and publicly shaming them. Feminists have already begun taking action. Recently, AlexVivy Nnabue called out one of the men involved by posting his identity on both X and LinkedIn. He had asked Grok to remove actress Kehinde Bankole’s clothes, but after being exposed, he privately begged her to take the post down. He claimed he had just landed a contract and his employer had seen the post. This kind of public accountability can be powerful, especially when men realise that their online behaviour can damage their reputation and opportunities in real life.
At the same time, tech companies and developers need to restrict people from making such demands and taking advantage of women.
We need the law to catch up with the pace of technology in every country. Earlier this year, the UK passed a law that makes it a criminal offence to create sexually explicit deepfake images of adults without consent. In the United States, President Donald Trump signed the “Take It Down Act”, which allows victims of non-consensual intimate imagery to report the content and have it removed within 48 hours. Offenders can also face up to three years in prison.
Other countries, especially in Africa, must begin to take this issue seriously. It is time to introduce similar laws to protect women from this new form of digital violence, because without legal consequences, the abuse will keep growing until it is too late.
About the Author:
Miracle Okah is the first daughter of two teachers. She initially dreamed of becoming a doctor but ultimately found her true calling in writing, where she discovered the power of words over stethoscopes. Passionate about African literature and amplifying the voices of Black women, her work has been featured in Amaka Studio, Black Ballad, Better to Speak, Black Girl X, and beyond. She is on the writing track for the 2025 Adventures Creators Programme.