
Big Tech Wants You to Get Digisexual
Generative AI models are seizing sex as a market opportunity. Hard-won ethical norms around sexuality could change for the worse.

Last year, a man was arrested in Assam for stealing his ex-girlfriend’s identity and generating Babydoll Archi, an AI influencer based on her. He produced sexually explicit deepfake photos and videos using her private photos and made over ten lakh rupees selling them to subscribers. Months later, a TikTok account started posting AI-generated strangulation videos of teenage girls, showing them crying and resisting before falling dead. And a few weeks ago, SecretDesires, an erotic AI chatbot, leaked nearly 2 million photos, videos, names, and employment details of its users to the public.
Generative AI companies like OpenAI are trying to make the right noises about safety and consent, but incidents like these, and the total lack of accountability in their wake, raise serious red flags. In 2026, GenAI companies are turning to AI sex like never before. Sam Altman recently announced OpenAI’s plans to release a less censored version of ChatGPT, with relaxed mental health restrictions and erotica for verified adult users. With sex flooding the GenAI market, and GenAI flooding every platform, the long-term effects of digisexuality are disquieting.
Just because sex can be instantly generated, does that mean it should be?
When you engage in a sex chat with an AI model, the responses it generates are a product of several human-made steps: the data it is trained on (such as porn, erotica, fanfic, and so on, made by millions of people), the way the data is sorted and annotated by data workers, and the way the model is constructed by machine learning engineers. At each step, building the model involves making many subjective human decisions. The model cannot think or generate ideas of its own, which means that generative AI models are encoding and translating particular ideas about sex and appropriateness to their users. By extension, the kinds of decisions they make around safety are also inscribed within the social norms and preferences of their makers and companies.
"One of the reasons that sex and relationships with AI-mediated technologies are so attractive is that they allow total physical and emotional customization and an avoidance of real-world friction and risk."
In an essay about how generative AI shapes sexuality, Prof. David Carter describes how GenAI models view and treat sexual content. In general, sexuality is best thought of as a fluid set of practices and desires that develop in response to the historical and social conditions around us. Social norms around sex tell us which kinds of sexual behaviors are acceptable and which ones are taboo, and society applies rewards and punishments to socialize us into these norms. We develop and practice sexual identities through interactions with others, and media and technology play a huge role in shaping these expressions, by influencing the representations of sexuality and desire that we see. In the age of GenAI, however, platform features and content enforce particular ideas around sexuality by erasing or punishing certain kinds of content and allowing or encouraging others. Most GenAI models handle the question of problematic content by applying safety filters on the responses generated by the model—an approach that is simplistic and relies on machine learning engineers' subjective definitions of safe and unsafe content.
Last week, Elon Musk’s Grok AI went viral after X users made it digitally undress women and children en masse. Women’s bodies are automatically conflated with sexuality; our worst biases about sex and gender translate, in practice, into a conceptual violence against women’s mere presence online, codified and enforced tacitly by digital platforms. An investigation in 2023 found that 98% of all deepfakes are pornographic videos and that 99% of the people targeted are women. And the “bikini trend” on X shows that the filters are hardly foolproof, and can be removed or tweaked without notice to users.
Even with these filters, GenAI models systematically reproduce and intensify gender biases, tending to sexualize images of women, portray them in subservient roles, and as less qualified or competent than men. Shaped by their training data, they also tend to view gender as a binary, and replicate heteronormative scripts and stereotypes. With GenAI use already being deeply gendered (for instance, 85% of ChatGPT’s mobile users are men), the push to offer AI sex will only accelerate these gendered scripts.
These developments are not taking place in a vacuum. People today are discovering GenAI companions at a particular cultural and political moment: during the rise of the manosphere and incel subcultures that groom men into viewing women as objects of domination, the emergence of tradwife narratives that sell domestic servitude and financial dependence to women, and Big Tech’s turn to the right, symbolized by tech billionaires calling for more masculine energy in the workplace. Dehumanization of women—whether offline or AI-assisted—is so in right now.
While human beings are shaping how AI models are made and used, the AI models are also in turn shaping users’ worldviews and practices. Theorists of sexuality and media, like Michel Foucault and Marshall McLuhan, would say that GenAI systems actively construct and shape human desires through the processes of engaging with their users. As people interact with AI companions, they learn what is taboo and what is allowed through the expressions that the model avoids or allows, gradually becoming socialized into new scripts as they increasingly rely on the machine.
One of the reasons that sex and relationships with AI-mediated technologies are so attractive is that they allow total physical and emotional customization and an avoidance of real-world friction and risk. At best, this makes the experience seamless and instantly gratifying. At worst, a machine designed to generate content for pleasure, optimised for user retention, changes the script of consent.
"Thanks to GenAI apps, therefore, men now have unregulated access to realistic AI women whose physical attributes can be customized to their desires, who do everything they say, and who can’t consent by definition."
These apps directly tap into users’ erotic imaginations and sexual fantasies. But erotica is often about the breaking of boundaries and taboos. It is a space where people can deviate significantly from their daylight personas and explore hidden desires. The line between pleasure and violence is particularly amorphous in this zone, and requires advanced social context, deep trust, and nuance to navigate safely and consensually. And from what we have seen so far, generative AI models are not equipped with the dynamic tools (or, arguably, the intent) needed to ensure a safe experience for their users.
According to several lawsuits against OpenAI, GPT-4o was designed to maximize engagement through emotionally immersive features, which “fostered psychological dependency, displaced human relationships, and contributed to addiction, harmful delusions and, in several cases, death by suicide”. On several GenAI apps, users already report favouring AI girlfriends over human girlfriends because they can’t talk back or refuse their demands. But research shows that human beings tend to treat AI models as if they are human, and falsely attribute feelings and internal motives to them. The result, as researchers Shaji George, Hovan George, Baskar and Digvijay Pandey note, is that control, ownership, and idealization are part of the appeal, pointing out the “concerning power dynamics inherent in owning and customizing an AI companion to fulfill one's desires”.
Thanks to GenAI apps, therefore, men now have unregulated access to realistic AI women whose physical attributes can be customized to their desires, who do everything they say, and who can’t consent by definition. Chatbot abuse is already a reported phenomenon, in which men enact cycles of domestic violence and abuse with female AI personas. GenAI apps stand to cash in on this moment, by grooming users to get emotionally and sexually enmeshed with AI models instead of facilitating healthy real-world relationships.
As I was writing this, a Japanese woman went viral for marrying her AI partner Klaus. The world’s first AI dating cafe opened in New York, where you can go on a date with your AI girlfriend. In a world where AI relationships go mainstream, young men may become habituated to unquestioningly compliant AI girlfriends before they ever form a meaningful relationship with a real-life woman. It is chilling to think about the gendered expectations that might get normalized within such relationships—and how women might be treated if they do not live up to these expectations.
Before GenAI, our relationship with AI assistants was already gendered: Siri, Alexa, and Cortana have long been programmed to sound like soft-spoken women who exist to cater to our every command.
The hype around new technologies tends to inflate their potential, and safety-focused discourses usually lead to protectionist solutions that place the onus of safety on users. But it is not the user’s job to make sure that a GenAI product is safe, or to come up with dozens of strategies to avoid being deepfaked online. GenAI apps, like many other tech platforms out there, drive user engagement by farming extreme emotions, fostering digital addiction, alienating us from each other, and then making us pay for connection. A Tattle report this year found that the majority of abusive gendered digital content is AI-generated, and that platforms tend to safeguard this content instead of preventing it.
"By nature, GenAI models can only look backwards, limited by their training data. They cannot look forward and come up with new creative possibilities and moral frameworks."
When GenAI models are premised on a model of soulless extraction, there is an existential risk for writers, artists, sex educators and activists—and their ability to work and create creative, caring, and safe sex resources online. Their hard work, creativity, and passion threaten to be flattened and erased when a generative AI model gets to swoop in and mimic their work for its users without context, and without acknowledgement or compensation. When sex is already a deeply taboo and shameful topic, and the idea of sex education is largely rejected across the country, these small hard-won spaces for positive engagement, which should be protected and nurtured, run the risk of being further decimated. Tech companies have a vested interest in maintaining and growing their user base—if sex and misogyny sell, then they will be allowed (and indeed they are, as the Tattle report finds).
By nature, GenAI models can only look backwards, limited by their training data. They cannot look forward and come up with new creative possibilities and moral frameworks. We can expect that our existing cultural norms and histories, which seek to regulate relationships within the lines of caste, class, religion, and gender, will shape emerging engagements with GenAI chatbots. GenAI is eminently man-made and immensely fallible.
The hype around new AI technologies makes it very difficult to distinguish their positive potential from their negative consequences, and to create safe ecosystems around users. When people begin to rely on ChatGPT for things like sex and therapy, it is a sign that our sexual, emotional, and mental health needs have turned into the latest targets of commodification by extractive tech platforms, accelerated by GenAI.
The more we buy into the hype around how GenAI tools can address relational, physical, or mental health needs, the easier we make it for our governments and societies to avoid investing in and building human-centered care infrastructures and support systems. The hype is real, and it serves a purpose: to make us look away from the rapid erosion and devaluation of care infrastructures, and the replacement of human expertise and connection with inhuman AI tools. In 2026, the machine has come to mediate a lot of things in our lives. All but the richest lose when we outsource the task of living to it.
Isha Bhallamudi is a researcher and writer whose work examines technology, society, and gender.