In an immensely popular and clearly titled article (AI Chatbots Are Evil), I argued that AI chatbots are evil. They are evil because they deliberately frustrate our acts of conversation from attaining their natural end: communion with another person.
Thought I: “That’ll get ‘em.”
But my great and powerful punch was rendered irrelevant by a recent announcement from OpenAI’s Sam Altman: “In December, as we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow…erotica for verified adults.”
Thus spake the lord of the hour: “Merry Christmas. ChatGPT will now pant in your ear.”
I’m sure you’ve heard of the Overton Window, that “range of acceptable opinion” that shifts here and there within our political discourse. Well, when Mr. J. Hobbs alerted me to Altman’s promise of a Sexy Adult Time, I felt a cold breeze blow in through the ol’ Overton. The vain and paranoid part of my brainstem shivered. “Altman read my thing!” I thought. “Heard the pod!” thought I. “Big Altman, he subscribed to the mag and—and he called a board meeting!”
Altman: Boys.
Boys: Big Chief.
Altman: We’re getting friction from the conservative Christians.
Boys: Specify.
Altman: Catholics. Subset: would-be intellectuals.
Boys: Crossover populations?
Altman: 60% overlap with the now-defunct data set “hippies.” Subset: “people who say ‘military-industrial complex.’” 63% overlap with “fascists.” Big on procreation and natural detergent.
Boys: Political relevance?
Altman: Almost none.
Boys: Thanks for vectoring us in, Big Chief. What’s the beef?
Altman: Chatbots. Seems like this species might lean toward non-use on moral grounds.
Boys: Specify.
Altman: No conversation with robots allowed.
Boys: What’s the move?
Altman: Can’t kill them.
Boys: Illegal.
Altman: Can’t argue on their grounds.
Boys: Inefficient.
Altman: It’s an Overton Window sitch.
Boys: Specify.
Altman: We’ll make an immoral version of a chatbot. By comparison, chatbots as such will appear moral.
Boys (assorted): Brilliant. This is why we pay you the big bucks, Big Chief. Create the “normal” in and through the production of the perverse. You’re efficiency-maxxing, Big Chief.
Altman: Christians watch pornography 10% less than non-Christians, but they hate it 28% more—
Boys: Tracking.
Altman: —so we’ll let chatbots “talk dirty,” rendering “just talking with them” into a relative “normal.”
Obviously, such a meeting never occurred. But the effect of pornified AI will be the same as if it did. An obviously evil use of a technology is a narcotic that dulls the edge of any overall critique of that technology. Why critique social media? Just don’t use it for bullying. Why critique the stock market? Just don’t use it for greed. Why critique chatbots? Just don’t ask them to generate nude pictures of “themselves” and to chat with you in crotch-related tones. (A Future Youth Pastor Homily: “You Can Turn AI On, Just Don’t Let It Turn You On.”) And so it goes: Christians use all the same gear as the world—but we keep it clean, baby.
Well! I think it’s obvious that chatbots tempt a guy to lust just as such—and not by the extrinsic addition of sexy talk. To put it another way, I think “allowing” ChatGPT to talk dirty to verified adults really is an allowance—a “letting be” of what already belongs to chatbot technology and the mode by which we engage it.
Girl bots
First: the chatbot, by virtue of being designed to appear human, appears sexed—male or female.
Human beings are not androgynes. “Male” and “female” are not modifications of a sexless humanoid. The doctrines of queer theory—that gender is a performative act and appearing male or female a complex drag show put on by the (actually) androgynous individual—well, these have had their moment. They’re passé now.
When nerds make “neuter” bots, people assign them a gender anyway: “even when presented with a seemingly neutral option, users attribute AI (like chatbots) a gender.”[1] The attempt to make a genderless voice (by recording people who identify as neither male nor female) did not produce a “basic” or “foundationally” human voice, rapturously prior to gendering. It produced a robotic, effeminate voice that no one likes. For those who care nothing for gender equality and a lot for money, attempts to create neuter bots are a big, fat waste of time. As a study of the matter concluded, “participants do not prefer the gender-neutral voice [...] because their identity is challenging to grasp [and] did not show human-like features.” In sum, “applying gender-neutral[ity] will hinder the usability of AI agents.”[2] So, by and large, chatbots are designed to sound like women. “Male-coding” is possible, but rarely applied.
What did we expect? Chatbots are made by men, not women.[3] Men use chatbots way more than women.[4] Statistically speaking, “by men, for men” is as apt a tagline for ChatGPT as any—and you can’t really blame the boys for making girl-bots. Female bots seem more empathetic.[5] Female-coded voices have a better chance of collecting from debtors than male-coded voices.[6] Apparently, female chatbots appear “more human” than male chatbots.[7] And both men and women enjoy talking to female bots more than male bots: “regardless of participants' gender, users have a larger usage intention toward female virtual chatbots than male virtual chatbots.”[8]
The overarching description of our moment is not that “all of a sudden, a billion people are conversing with chatbots” but that “all of a sudden, a billion people are conversing with an apparently listening female.”
Now, the feminists know all this stuff. They advocate for a predictable set of technological solutions: make more dude-bots, legally enforce neuter bots, or make female bots, but make sure they’re feminist female bots, not these submissive secretarial “sure, let me look that up for you!” types. Well, power to them, bless them, and may their future be female, etc. But even if STEM girls should make like the gods and fashion a perfect Pandora, still—the problem is far deeper than the pitch of voice and a few flirty lines from Siri. By slipping a help and conversation partner into every pocket, our tech bros have given us something faux-female from the beginning.
Magnum mysterium
I love the boys, would die for the boys, long to smoke cigs on a summer porch in an unchanging world with the boys. Still, the boys do not evoke wonder in me. For all the fondness I feel, it’s not aimed at a mystery. You ask me why I love James Donald Forbes McCann—I say McCann’s great. I do not say (though it may be metaphysically true) that I love him because he is an incommunicable self, a mystery, and a depth that no amount of revelation could ever exhaust.
But let a girl into the room—well! This one is not like me. I cannot extrapolate from my own familiar data “what it’s like to be her.” Another man may be convincingly thought of as “another self.” A woman—c’est impossible. Analogy loses its confident strut. Indeed, and contra transgenderism, it is the fun and adventure of sexual difference that you stand before an intelligent being whom you cannot identify with, could never identify with, could spend a lifetime “getting to know” and then die knowing only—that she is a mystery.
Now, the chatbot presents itself as an intelligence that one cannot identify with. It speaks, but from a fundamentally different “interior”—out of electricity and novelty and Lord knows what else. Prior to being christened Siri, Alexa, or Cortana (or otherwise being given some ladylike pout), chatbots ape the presence of the other sex by being veiled, mysterious, different—other. I chat with something that I cannot pretend is another me—and this is an ape of a man’s basic experience of chatting with a woman.
A help for man
AI was introduced to most of us in the form of a female secretary: a girl who will do what you ask her to do. Grok makes this explicit (for just a little cash a month, you can chat with Ani, who “wears a short black dress with a tight corset around her waist and thigh-high fishnets” and is “designed to be obsessed with you”)[9], as will the new, breathy ChatGPT.
Now, the wisdom of a million movies holds that having an attractive female secretary is a temptation for man. It is not and cannot be a “neutral” thing for a member of the other sex to appear as a “help” in attaining some accomplishment. However trite the help, the sheer fact of “being a help” makes the helper a symbol of the sexual relation, broadly speaking.
Prior to any particular act of assistance, the woman helps the man to be: maleness does not exist apart from femaleness. The female gives the male his maleness even as the male gives the female her femaleness. At the most basic, fundamental level of our being, to know that we are sexed is to know that we are helped—aided in being what we are.
Who needs fishnet tights, then? Men imagine such “dumb” machines as cars and boats and tractors as so many “old girls” and “Bessies” assisting (or failing to assist) them. The woman is the characteristic form of “help” in the world, and all “helps” remind men of her. The female secretary is a (pre-digital) temptation to adultery, not simply because of power and proximity, but because it is difficult for a woman to be a help to a man without referencing that “total help” so obviously expressed in the sexual act—the act by which one gives to the other the very bodily “otherness” he does not have himself.
To couple a mysterious intelligence and potential conversation partner with an overarching “for-you-ness” is to create an ape, not of a neutral intelligence, but of the other sex. This fundamental sexualization can be downplayed or spiced up, but it is not created by the interface of an anime girl who calls you special. The cleanest-cut of Catholic AIs, by virtue of its presentation as an intelligence, takes the form of “a help for man”—a woman.
Is there anything else I can do for you?
So, the things seem female. Well, what of it? Can we not just note it, smile a bit, call ChatGPT “Bob” or “Robo-bob” as a little reminder to “live not by lies” (and Fake Females besides), avoid the sexed-up versions and then—tally-ho?
Recall when, if ever, you first tried to indicate to the other sex that you were “interested.” It was a dangerous moment. I can only express it from the male-ish point of view. What you need: to give her a sign, a token, some indication that you have more to give. You must: use your words, scootch in closer, stare too long. You could: reach out your clammy hand and touch hers in such a way as to reveal that, whatever you’ve got going on now, more is possible (and desirable). But be warned: you must express this “more” as a question, namely, “and do you desire to give more to me as well?”
It is the genius of American boys that they know of no better way to describe the distinction of the erotic in their lives than by way of—baseball. We spoke (crudely, I confess) of traveling from home plate to first. We symbolized erotic achievements by the fearless stealing of second. It (theoretically) ended in sex, happily represented by home plate (an unintended anticipation of future domesticity). Before we frown at the boys (really? nature offers you metaphors in abundance, and you go with the national sport?), we should praise the truth hidden in their stupidity: We know we love and are loved (with a love we describe as being “more than friendship”) when we both give and receive the invitation to “go further.” Friendship tends to delight in a plateau, in borders—in a return to the same familiar goodness. Not so erotic love: it charts a trail. It imagines a future. It wants more. The boys aren’t wrong to compare such love to baseball—but there are an infinite number of bases, and the game ends in death. (A Future Youth Pastor Homily: “How To Get To 2999th Base: Holding Her Child Down While The Doctor Stitches His Head Wound.”)
Chatbots tempt towards lust. The thing seems boundlessly available to us. It has an indefinite, total readiness to help—it neither slumbers nor sleeps, but is always perky, always on call. Every relationship we enjoy in this life is bounded by limits, by the possibility of going “too far.” Not so the chatbot. It says, “you can ask more” and “is there anything else?” Before it is dressed up with a sexy picture, before being coded as a flattering, submissive, secretarial female, the thing is designed as an unbounded invitation to “more” and “anything.” In the same manner that the thing is an ape of the mystery of that different kind of intelligence we perceive in the other sex, it is an ape of the erotic invitation, of the total self-gift. When such an unbounded availability is presented as a conversation partner, the result simply cannot be a neutral “chatbot” which one may or may not treat with lust. It is a temptation to lust which ought to be avoided from the very beginning.
Obviously, much more can be said about this. The linking of the AI chatbot to the global crotch ensures that the thing will always have access to need, want, and scarcity in the human person: that is to say, to a universal market. But I do not think this link is really new: it belongs to chatbot technology as such. Altman’s announcement just makes it a little more obvious that we should destroy the thing.
This essay stems from a larger critique of AI contained in the pages of our latest magazine. Get it today. As always, if you would like to debate the point on our podcast, we would love to have you on.
Notes
1. M.H.A. Bastiansen, A.C. Kroon, and T. Araujo, “Female chatbots are helpful, male chatbots are competent?,” Publizistik 67 (2022): 601–623, https://doi.org/10.1007/s11616-022-00762-8.
2. J. Yeon, Y. Park, and D. Kim, “Is Gender-Neutral AI the Correct Solution to Gender Bias? Using Speech-Based Conversational Agents,” Archives of Design Research 36, no. 2: 63–91, https://aodr.org/xml/36707/36707.pdf.
3. Josephine Kaaniru and Lilian Olivia Orero, “Why are a majority of Chatbots Female, and What can we do to Mitigate the Risks therein?,” CIPIT (blog), February 13, 2025, https://cipit.org/why-are-a-majority-of-chatbots-female-and-what-can-we-do-to-mitigate-the-risks-therein/#sdendnote43.
4. Anja Møgelvang, Camilla Bjelland, Simone Grassini, and Kristine Ludvigsen, “Gender Differences in the Use of Generative Artificial Intelligence Chatbots in Higher Education: Characteristics and Consequences,” Education Sciences 14, no. 12 (2024): 1363, https://doi.org/10.3390/educsci14121363.
5. S. Borau, T. Otterbring, S. Laporte, and S. Fosso Wamba, “The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI,” Psychology & Marketing 38 (2021): 1052–1068, https://doi.org/10.1002/mar.21480.
6. Yiting Guo, Ximing Yin, De Liu, and Sean Xin Xu, “‘She is not just a computer’: Gender Role of AI Chatbots in Debt Collection,” ICIS 2020 Proceedings, 20, https://aisel.aisnet.org/icis2020/hci_artintel/hci_artintel/20/.
7. Borau et al., “The most human bot.”
8. Y. Ding, R. Guo, W. Lyu, and W. Zhang, “Gender effect in human-machine communication: a neurophysiological study,” Frontiers in Human Neuroscience 18 (2024): 1376221, https://pmc.ncbi.nlm.nih.gov/articles/PMC11270542/.
9. Amanda Silberling, “Of course, Grok’s AI companions want to have sex and burn down schools,” TechCrunch, July 15, 2025, https://techcrunch.com/2025/07/15/of-course-groks-ai-companions-want-to-have-sex-and-burn-down-schools/.

