The surprising surge of AI girlfriends and AI boyfriends: Should we be concerned?

While most people fixate on economic upheavals and existential threats, a quieter AI revolution has taken root—in the realm of romance. For about the price of a pizza a month, anyone can subscribe to an AI girlfriend or AI boyfriend that never forgets an anniversary or drifts off mid-conversation. How, exactly, did we land here—and should we be alarmed?
Relationships thrive on shared stories, subtle gestures, and steady care. Whether it’s a schoolyard sweetheart or a colleague you met only recently, a partner shares your joys, shoulders your worries, and knows your daily quirks. Still, one wonders: what if there were a partner who never missed an anniversary, anticipated every need, and never tired?
That “perfect partner” has, in fact, been long in the making. The phenomenon of AI girlfriends and AI boyfriends is nothing new.
The emergence of AI romance. It all started in the 1960s
The first AI partner. Her name was ELIZA
The notion of turning to machines for companionship can be traced back to the 1960s, when Joseph Weizenbaum, a German-American computer scientist at the Massachusetts Institute of Technology, created ELIZA, one of the first programs that convincingly simulated human conversation. In his 1966 journal article, Weizenbaum published the following transcript:
User: "Men are all alike."
ELIZA: "IN WHAT WAY"
User: "They’re always bugging us about something or other."
ELIZA: "CAN YOU THINK OF A SPECIFIC EXAMPLE"
User: "Well, my boyfriend made me come here."
ELIZA: "YOUR BOYFRIEND MADE YOU COME HERE"
User: "He says I’m depressed much of the time."
ELIZA: "I AM SORRY TO HEAR YOU ARE DEPRESSED"
Although ELIZA was limited by today’s standards, it demonstrated how text-based interactions with a machine could mimic aspects of a therapeutic or friendly dialogue—foretelling the appeal of today’s AI girlfriends and boyfriends. One story recounts how Weizenbaum’s secretary asked to speak with ELIZA in private, urging Weizenbaum himself to step outside. The incident unsettled him; if a simple script could win such confidence, he concluded, more advanced systems might tempt users into even deeper—perhaps unwarranted—emotional investment.
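To see just how modest the machinery behind that exchange was, here is a minimal illustrative sketch of the keyword-matching and pronoun-reflection trick ELIZA relied on. It is written in Python rather than Weizenbaum’s original MAD-SLIP, and the patterns and replies below are invented for illustration, not taken from the actual DOCTOR script:

```python
import random
import re

# Swap first- and second-person words so "my boyfriend" can be echoed back
# as "your boyfriend". Purely illustrative, not Weizenbaum's original rules.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "i", "your": "my", "yours": "mine",
}

# (pattern, response templates); "{0}", "{1}" are filled with reflected matches.
RULES = [
    (r"i am (.*)", ["I am sorry to hear you are {0}.", "Why do you think you are {0}?"]),
    (r"(.*) made me (.*)", ["{0} made you {1}?"]),
    (r"(.*)always(.*)", ["Can you think of a specific example?"]),
    (r"(.*)", ["Please tell me more.", "In what way?"]),  # catch-all, tried last
]

def reflect(fragment: str) -> str:
    """Swap pronouns so a fragment of the user's input can be echoed back."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(sentence: str) -> str:
    """Return the first matching rule's template, filled with reflected text."""
    cleaned = sentence.lower().strip(" .!?")
    for pattern, templates in RULES:
        match = re.match(pattern, cleaned)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return random.choice(templates).format(*groups).upper()
    return "PLEASE GO ON."

if __name__ == "__main__":
    print(respond("Well, my boyfriend made me come here."))
    # Prints something like: WELL, YOUR BOYFRIEND MADE YOU COME HERE?
```

Rules are simply tried in order, with a catch-all pattern last, which is why even input the script has no rule for still earns a plausible-sounding reply. A modest set of such patterns was enough to produce the kind of exchange quoted above.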
Weizenbaum later became something of an AI heretic, clashing with the very AI community he had once helped to shape. A decade after unveiling ELIZA, he published the book Computer Power and Human Reason (1976), in which he forcefully condemned the idea that computers could—or should—supplant humans in matters demanding empathy or moral judgment. Romance, of course, demands both.
Yet his cautions were eclipsed by the mounting enthusiasm that propelled AI research forward. As it often goes whenever humans encounter a transformative innovation—be it single-use plastics or the smartphone—we embrace novelty wholeheartedly, and only later measure its real costs.
Now, with AI presenting itself as a companion—an AI girlfriend here, an AI boyfriend there—many find Weizenbaum’s early cautions echoing louder than ever.
AI friends, AI girlfriends, AI boyfriends: Are they here to stay?
Today, the transcribed conversation from Weizenbaum’s 1966 paper feels as antiquated as a silent film projected in an IMAX theatre. In 2025, chatbots don’t merely parrot keywords; they juggle context, craft persuasive narratives, and adapt their tone to an uncanny degree. They can paint like Picasso, rhapsodize like Shakespeare; they can console, advise, and even flirt. But can they feel? Or is the flirtation mere illusion: scripts stitched together with statistical precision?
AI partners can be good for you, but they do not care about you
No matter how well a modern chatbot, an AI girlfriend, for instance, might fool an unsuspecting conversational partner, “she” remains unable to truly feel. Whether presented as “she” or “he,” the chatbot cannot provide the deep empathy that underpins genuine human connection. Chatbots excel at “thinking,” not “feeling”; that makes them well suited to work as AI assistants, but lacking true emotion, they inevitably fall short of replacing our flesh-and-blood partners.
“No other organism, and certainly no computer, can be made to confront genuine human problems in human terms… They cannot, in short, be made to care,” Weizenbaum argued.
Decades of breakthroughs in neuroscience and AI have done little to refute his central claim: while algorithms can replicate the structure of human dialogue, they do not share the subjective experience we call emotion.
Still, they are here now—AI girlfriends, AI boyfriends, AI therapists, AI friends—some subscription-based, others free—arguably more widespread than Weizenbaum could ever have imagined. An analysis by App Radar and SplitMetrics shows these apps have collectively amassed 225 million downloads on the Google Play Store—enough to give every resident of Nigeria, Africa’s most populous country, their own AI girlfriend or boyfriend. And that does not take Apple’s App Store into account.
So perhaps the real question isn’t whether your AI girlfriend or AI boyfriend actually cares about you, but whether you mind that they don’t—so long as the benefits still feel real.
What are the positives of AI romance?
Despite worries about unintended consequences, AI partners can offer real advantages. For those dealing with social anxiety or isolation, a chatbot’s 24/7 availability can provide a safe, judgment-free space to rehearse conversations, explore emotions, or simply vent. Unlike a human partner, an AI partner always responds immediately, nudging users to reflect on their mood, try coping strategies, or even take a brisk walk.
AI partners can be great for empathy and guidance
AI partners also help expand access to emotional support. In communities with overburdened mental-health services—or where “talking to someone” is stigmatized—an AI partner, whether romantic or not, can be the only “ear” at hand. Others appreciate the comfort of an always-ready confidant, whether it’s engaging in mock banter over morning coffee or exchanging lighthearted messages at day’s end. For these users, an AI partner offers a steady source of connection in a world that often feels anything but.
In one Guardian article, researchers describe a study conducted in January that surveyed over 1,000 students using an unnamed AI partner chatbot. Roughly half considered it a nonjudgmental AI friend and reported less anxiety and stronger social support. Nearly a quarter said it sparked positive changes in their thinking or behavior, while 30 participants credited it with preventing them from attempting suicide.
These findings suggest that, perhaps, the real value in AI partners lies less in simulating romance and more in offering a steady source of empathy and guidance. Yet that boundary can be difficult to pin down. This brings us, naturally, to the more unsettling questions around AI romance.
The dangers of AI romance
Your AI girlfriend or AI boyfriend can erode the real-world connections you need most. We are already on the path to estrangement in our ever-online era—scrolling past loved ones’ milestones with a casual “like,” spinning highlight reels of our lives as if they were movie trailers. Now, AI-driven romance takes that detachment a step further.
Psychologists caution that digital engagement with AI girlfriends and boyfriends can foster an illusion of genuine emotional connection, leading some users to neglect real-world relationships and personal growth. Others warn that over-reliance on an AI partner could reinforce unhealthy attachment patterns, making it harder to navigate human interactions with all their inherent complexity. Privacy and data security also loom large: to refine the “relationship,” an AI application often collects sensitive user information, raising concerns about who controls—and potentially profits from—those intimate details.
In short, while AI-driven romance can seem an appealing shortcut to companionship, it may threaten the very spontaneity, vulnerability, and mutual understanding that give real love its depth.

Isolation: real-world ties wither as users retreat to an always-agreeable companion.

Dependency: over-reliance on an AI partner reinforces unhealthy attachment patterns and makes human relationships harder to navigate.

Derealization: the illusion of genuine emotional connection blurs the line between simulated affection and the real thing.
Where code meets compassion: Should we be concerned?
Yes, and no. But we need to keep talking about our AI partners
So here we stand, where code meets compassion. Analysts question whether AI girlfriends, AI boyfriends, and AI friends will remain supplementary tools for support—employed when needed but not replacing human relationships—or whether their allure could steer some users away from traditional intimacy. Over time, further research and shifting social norms will likely clarify the extent to which these technologies alter how people form connections. For now, it is a collective task—across developers, policymakers, and users—to balance the potential advantages of AI companionship with the foundational elements of human interaction.
Like countless innovations—whether it’s digital currencies or the smartphone itself—your AI girlfriend or AI boyfriend is neither purely good nor wholly bad for you. The key is to keep the conversation alive.

Wail El Badaoui
Wail is a seasoned Product Manager with over 7 years of experience working remotely, specializing in building and optimizing AI-powered products. With a deep understanding of the challenges and rewards of remote work, Wail is passionate about leveraging AI tools to simplify workflows, boost productivity, and create a more balanced work-life environment. When not streamlining user experiences, Wail enjoys experimenting with new tech, fine-tuning productivity hacks, and sharing insights on optimizing remote work.