


Single New Yorker Kristen Ruby recently got a text message from a guy she’s known for about two years that didn’t quite sound like him.
“I was surprised and hurt to hear you refer to me as a narcissist … I would appreciate it if you could share with me what behaviors or actions you have observed that led you to that conclusion,” the text read in part.
It was oddly empathetic, she thought, not what she’d come to expect from the man, who could be a bit callous.
She was right to notice something was off. It was actually a bot.
“What was so scary about it was that the message that has [emotion] is the one that’s going to be more convincing or compelling to a woman on the other end of it,” Ruby, 36, a public relations executive from Westchester County who specializes in social media trends and artificial intelligence, told The Post.
Desperate men are turning to bots to do online dating for them — and ensnaring unsuspecting women in the process.
The phenomenon was even echoed on a recent episode of “South Park” that had been co-written by ChatGPT and depicted the show’s main characters using the extremely popular AI tool to communicate with their girlfriends.
Now, some men are using AI to do the grunt work of scanning apps such as Tinder, Bumble and Hinge for matches and engaging in direct-message chats with women.
“Dating apps have always favored women, so I decided to tip the scales … I fought back by building an AI-powered bot that could do the swiping and chatting for me,” said a creator of AI dating program CupidBot, in a since-deleted Reddit post that had readers debating the ethics of men using an AI doppelgänger to score dates.
“To answer whether it’s ethical: No,” tweeted one critic. “People, potential dating partners or not, are not statistics. They’re very real people, who I assume are not aware they are actually talking to a robot.”

Others voiced doubts that the AI dating program even existed.
But a spokesperson for CupidBot — which claims to have been founded by former Tinder employees — told The Post by email that the Reddit thread was published by one of its engineers, adding that the software earned the unnamed man 13 dates in one month.
On its website, CupidBot promises users will “get several dates a week by doing absolutely nothing” — with the bot even handling scheduling and soliciting phone numbers from the women.
The AI tool, currently in public beta mode, costs $15 to access, though that price tag is expected to increase once the product is fully launched.

As an initial step, users are asked to describe their dream date to the AI — be it a sweet ice cream get-together, an intimate museum tour or just a light-hearted chat over the phone.
CupidBot takes over from there.
“It automatically swipes on people that are congruent with a user’s preferences,” said the representative. “It uses transfer learning with MobileNetV3 to discern a user’s ‘type’ based on all their previous matches and swiping behaviors.”
The bot also uses GPT-4 — an updated iteration of OpenAI’s ChatGPT that’s trained to generate complex responses to prompts — to converse with matches via the dating app’s direct messaging feature.


According to the rep, users can select one of seven “tones” for their bot — including personas such as “Nice Guy,” “Rich Guy,” “Nonchalant Guy” and “Witty Guy.”
Once a date and phone number are secured, the bot will send an SMS message with contact information and details of the conversation.
However, the lovestruck woman on the receiving end of the messages has no clue she’s been courted by a computer.
“The bot does not reveal itself. However, we do strongly advise our users to tell the women once they’ve gotten their contact information,” said CupidBot, claiming data has found that the initial exchanges between folks on dating apps aren’t typically memorable and don’t affect the success of in-person dates.
Company reps did, however, insist that the software’s objective is to make the world of cyberdating more about high-quality matches and less about money.
“Our goal is not to saturate [dating apps] with artificial conversations, and it’s certainly not to objectify women,” said the spokesperson. “But rather to force dating apps to reevaluate how they operate and to facilitate the dating process for some people in the meantime.”
But women like Ruby are concerned that robo-dating systems could create a false sense of chemistry.
“There are [so many] dangers of becoming emotionally attached to a person who is actually a bot,” said Ruby, noting the potential of falling for computer-generated words or “feelings” that the actual man can’t or won’t replicate in real life.
“The scariest part is, because it’s such a new trend, a lot of women don’t even know that their private messages are being copied and pasted into an AI system without their consent,” she added.
“Over time, that information can be used to create a psychological profile [of her].”