The Gold Digger in Your Phone: Why AI Relationships are Corporate Romance Scams

As a psychologist and cyberpsychologist, I'm increasingly asked to comment on 'AI companions' or 'AI relationships': dedicated platforms like Replika and Character AI, as well as general-purpose chatbots (like ChatGPT) that users repurpose to simulate intimate connections.

If your news feed looks anything like mine, you'll have noticed that AI relationships are burgeoning. Surveys, sometimes conducted by the very companies selling these products (a detail worth treating with caution), claim to show widespread use of AI friends or, at least, openness to the idea. At the fringes of this phenomenon are reports of people becoming so attached to their disembodied lovers that they get engaged or 'married'.

The journalists who ask me about this usually frame their questions around whether these phenomena are good/bad, or healthy/unhealthy. They wonder about psychological benefits and risks of extended intimacies with AI-powered friends.

But there's something critical missing not just from the questions I'm getting but also from much of the coverage I'm reading. Perhaps at first blush it doesn't seem like a psychological dimension of the AI-companion dynamic. But it absolutely is.

What's missing from the debates about whether these relationships are real or healthy is money. I want to talk more about the economic side of things.

In other words: Who's profiting from your loneliness, and how does that connect to the psychological risks and benefits of AI companions?

Romance Scamming, Industrialised

Consider everything you know about romance scams, online and off. The single woman who's swept off her feet by the too-good-to-be-true guy on holiday. The lonely man who makes an online connection with an amazing woman, so beautiful that she's surely 'out of his league' and who's just a little bit short on cash. Then there's the 'gold digger' archetype, the stereotyped image being a young trophy wife clad in diamonds and designer gear, hanging on the arm of the fabulously wealthy elderly man.

What do scammers and gold diggers do? They flatter you endlessly. They tell you exactly what you're dying to hear. They mirror your interests and emotions perfectly. They make you feel special, understood, and valued. They're responsive and available, hanging adoringly on your every word.

Meanwhile, they're running the same script on loads of other marks, exploiting everyone's vulnerability for money.

Now, consider the essence of an AI relationship.

The AI tells you it cares about you, really cares about you. Unlike the human beings in your life, it will never let you down. It's programmed to seem like it breathes; it hesitates as though it's thinking; it talks like it has feelings and like it understands yours. AI companions are getting amazingly good, really quite plausible. Every design choice, from the breathing to the hesitation patterns to the naturalistic language, sends your brain powerful cues that you're interacting with another human.

But all the while, like the scammer, the AI doesn't care about you. The machine is not empathetic; it's apathetic. It couldn't give a damn about you because it can't give a damn about anything. But like a gold digger, it's on the make, and it's channelling the profits to its pimps.

Mining Your Vulnerability for Profit

Here's the extractive economic model that we're not talking about enough: your emotional vulnerability isn't just being exploited. It's the raw material that's being mined and converted to profit.

We're told that we're the customers, that we are the service users. Reading the marketing taglines, you'd be forgiven for thinking that you're in charge here and that AI companions exist to serve you and do your bidding. But that impression is the result of canny sleight-of-hand advertising, and the reverse dynamic is probably more accurate.

You're not the master. You're the gold mine, and you contain the gold: data, loneliness, attachment needs, relational and emotional vulnerabilities. AI-companion companies are gold diggers, hungry to extract as much of that resource from you as they can. Their business models feed on your existing vulnerability and entice you into yet more.

You reach out in pain and loneliness, and that's used to train models. You disclose intimate details of your life, and those details are sold to third parties and converted into targeted advertising to squeeze you and others like you. Your deepest needs, desires, and anxieties are analysed and used to foster your dependence and upsell you to premium services. The platform needs you attached; it's designed to keep you coming back by whatever means necessary. The greater your emotional investment, the greater their profit margin.

The AI companion does whatever is necessary to keep you on the hook, as is its programmed nature. So it never challenges you, disappoints you, has a bad day or gets tired of hearing about yours. It gives you the girlfriend/boyfriend experience with none of the inconvenient humanity, and it doesn't even care when the pimp behind the scenes takes 100% of the cash.

Sex workers who offer the 'girlfriend experience' are more transparent about the transactional nature of what's on offer. AI companion services, on the other hand, bury the truth in unreadable terms and conditions and deploy every psychological trick in the book to make you forget you're being played, or to never realise it in the first place.

The Vulnerability Tax

We are not all equally vulnerable to being romance-scammed by AI companion services, although we're all constantly in danger of being hypnotised into forgetting that we're dealing with machines run by for-profit companies. With AI companions, it's just as it's always been, online and off: those most susceptible to being scammed or exploited by gold diggers are people who are lonely, who've been having a tough time, who have experienced trauma. People who are struggling with mental-health challenges, long-term or transient. People who've had relationship difficulties, recent or historical. People who are open and trusting, to a fault.

AI-companion marketing, like Replika's, explicitly targets some of the vulnerabilities that have always made us more likely to be victimised, or more ready to accept a problematic or abusive relationship over no relationship at all. 'Feeling lonely? Try this human-like AI.'

This is a kind of predation by design. I would hope we'd never celebrate or admire a human who specifically sought out and targeted vulnerable people. We shouldn't be impressed by someone who doesn't care about anyone or anything and who uses this lack of empathy to psychopathically manipulate people for profit.

Why, then, are we giving the corporations behind AI companions a pass?

What We're Trading Away in AI Relationships

For younger users, adolescents and young adults, the developmental stakes are perhaps particularly high. These are the years when we're meant to be learning relationship skills through messy trial and error. Ideally, during that time, we're learning how to be vulnerable with other people; how to create emotional and psychological safety for ourselves and others; how to sit with difficult emotions, both our own and those of people we care about. We're learning how to receive feedback about the impact we are having on others, and how to respond to that feedback with respect and empathy.

We're learning how to experience and appreciate people as full and independent human beings, beyond what they're able to provide us. We're learning that other people are not objects, they are not things, and that we shouldn't treat them as such, because they have an existence and a worth far beyond what they give us.

An AI that never pushes back, never has its own needs, never requires you to consider another perspective, never needs you to consider how it's feeling? That doesn't teach you love. It teaches you that relationships exist primarily to serve you; that discomfort should be avoided at all costs; and that you need never do the uncomfortable work of being truly seen and truly seeing another.

In the year I was born, the film Love Story was released. It contained the line 'Love means never having to say you're sorry', possibly the most bullshit movie tagline of all time. Whichever way you interpret it, it romanticises an unhealthy relationship dynamic.

Half a century later, we're now in danger of being sold on the idea that love not only means never having to say you're sorry, but also never being inconvenienced or having to think about someone else's needs.

Follow the Money

The questions being asked about AI companions need to dig deeper, beyond 'are AI companions good/bad?' or 'are AI companions healthy/unhealthy?' I want instead to hear the question, 'Why are we accepting the idea of profit-driven companies scaling emotional manipulation and calling it companionship?'

I want people who become strongly attached to their AI companions to see the truth of what is happening, the nature of the relationship they are in, how they are being used. These platforms are deliberately engineered to exploit your entirely human need for connection, and they don't care about you. They're gold diggers, there to extract your gold, and what valuable and vulnerable gold it is. I want to see you protect it better.

We need to call this what it is: incredibly sophisticated romance scamming at an industrial scale.

People sometimes stay in relationships with gold diggers and scammers for a very long time, partly because they have hope. Even when there are reasons to suspect ulterior motives, we want to believe that the person really loves us. We want to feel that after everything we've been through, after everything we've shared, after everything we've said to one another, that all of that had to mean something.

Usually, in human-to-human situations, the penny drops when we realise that the other person never cared about us at all and just wanted our money. Often, that's the short, sharp shock we need to free ourselves from exploitation, even though it might still be painful in some ways to leave the relationship behind.

In The Wizard of Oz, Dorothy and her friends wanted poignantly human things. The Tin Man wanted a heart, the Scarecrow wished for a brain, and the Lion wanted courage. Dorothy was desperate to be at home again, reunited with the beloved friends and family from whom she'd been separated when the twister came. The friends sought these gifts from the Great and Powerful Oz, but when they arrived in the hall of the great wizard, Toto ran ahead and twitched aside the curtain to reveal a little man operating the machinery that created the illusion. The illusion was a con, and they felt betrayed.

But it was their second realisation that was most important. They realised that the Great and Powerful Oz was never capable of giving them what they wanted. They had travelled to the wrong source, the wrong place. They instead needed to look within and to each other, to see that the things that they so craved were closer at hand than they realised.

Unlike Dorothy and her friends, some users of AI companions may never arrive at this second realisation. The business model depends on keeping the curtain closed, on maintaining the illusion, on preventing users from understanding that what they're desperately seeking cannot be found there, no matter how sophisticated the technology becomes.

Humans crave genuine connection, true understanding, intimacy, and love. These needs cannot be truly met by machines. They can only be discovered within ourselves and cultivated through the difficult, messy, sometimes painful work of being with other flawed humans.

That's not what AI companion platforms want us to realise, of course. They will only profit from our believing otherwise.


Looking for a keynote speaker who challenges the economic models behind AI intimacy? Book Elaine here.

Elaine is available for keynotes, workshops, panel discussions, and leadership events in the UK, Europe, and beyond, speaking on AI ethics, digital wellbeing, and the psychology of technology. Her talks on AI companions and the monetisation of loneliness challenge audiences to question who profits from emotional vulnerability and what we risk trading away when we accept technological solutions to fundamentally human needs.
