Ever been in high school, nervous about breaking up with someone you’ve only been seeing for a couple of months? You might call a friend to help you craft the perfect breakup text. The two of you sit on your bed, laughing and cringing at each draft until you finally settle on a message that captures the classic “it’s not you, it’s me” sentiment you’re trying to convey.

Today, an AI chatbot called “AI4Chat Break Up Text Generator,” marketed as a personal assistant that helps people navigate difficult conversations, can do that for you. Beyond helping you break up with someone, AI can help you sext and build an AI girlfriend companion you might one day want to propose to (turns out the movie Her could have been a documentary if they’d waited a few years).

In the 21st century, using AI to manage nearly any aspect of one’s love life is becoming a popular option. Some reports find that growing numbers of Gen Zers (especially 18- to 26-year-olds) are relying heavily on AI-enabled apps to find love, plan dates, and get relationship advice.

In the Philippines, where mixed signals often dominate conversations (a not-so-secret problem many know all too well), it’s also not surprising to see many yearning for something clearer.

Marshaley Baquiano, a licensed psychologist and an Associate Professor of Psychology at the University of Guam, attributes this phenomenon to the cultures that have long shaped the Philippines. 

“We want to always be there because we’re from a collectivist culture,” she says. “As people coming from a collectivist culture, we see ourselves as not independent beings, but interdependent individuals. And so we see ourselves as members of a group. So I think that is why we seek that connection. And AI helps us address that need.”

For some, these tools aren’t only useful for the beginning or ending of relationships. Ishani and her partner of 10 years are part of this new generation of couples using AI to improve their existing relationship.

“We have never gone for couples therapy but we tried Paired before exclusively moving over to Agapé,” Ishani says. Paired and Agapé are both apps meant for couples seeking to improve their relationships: Agapé focuses on having couples reflect on one question together every day, while Paired is more of a coaching tool that includes quizzes and relationship advice from experts.

The 25-year-old from Canada was searching for a way to stay connected with her long-distance partner when she came across these apps on Instagram. “I love words of affirmation but my partner is avoidant and giving compliments doesn’t come easily to him,” Ishani says. “These apps help us bridge that gap and communicate better.” Plus, since they’re long distance, the apps give them something to talk about on the phone.

Distance is also a factor in seeking more, as Dr. Baquiano says. “When you’re away from your family, when you’re away from the person that you’re in a romantic relationship with, the thing with AI is that it’s very accessible. It’s accessible 24/7. So it’s there. The support is there whenever you need it. It’s just at the tip of your fingertips.”

In a typical user journey, both partners download the Agapé mobile app and sync their profiles, after which they’re automatically assigned one question a day.

“Life gets busy, but we try to answer the question by the end of the day,” Ishani says. “We usually end up talking about [our answers] when we call the other person to say goodnight.” If they don’t have time to chat it out over the phone, the couple can answer the questions in the app and leave notes in each other’s comment sections. The ultimate goal of the app, Agapé says on their website, is for couples to ”both feel and show love.”

With regular user input, the company says the questions become more specific to each couple, thanks to a “complex machine learning algorithm.” Those personalized questions are a big reason Ishani continues to use Agapé. When they don’t want to answer the daily assigned question, they can pick from a different “deck” related to communication, family, and finances.

Then there are other apps, like Arya, which has built an AI-powered intimacy concierge to help couples improve their sexual intimacy. The app’s founder, Offer Yehudai, is a serial entrepreneur with a background in advertising and technology who says he saw a massive gap in the market when it came to relationships and intimacy.

“Americans are spending billions investing in their personal wellness. But when I looked for something to help couples invest in their relationships? Nothing.” Offer says. “That’s when I knew we had to create Arya—to bring that same tech-enabled approach to intimacy.”

Offer explains that when users download Arya onto their phones, the AI-powered “intimacy concierge” has them answer a few questions before sorting them into one of four “Erotic Personas.”

This set of personas was developed by an in-house research team, led by Nicholas Velotta, a PhD student at the University of Washington, and Pepper Schwartz, PhD, a professor of sociology at the University of Washington, after interviewing more than 50,000 couples.

The app’s AI intimacy concierge is trained to curate experiences based on these personas, offering virtual options in the form of guided techniques, audio erotica, and aftercare meditations, as well as in-person experiences through discreetly delivered packages containing various toys, bondage-related materials, and intimacy exercises.

To be clear, none of the apps in this story should be taken as a replacement for couples therapy, and none of them claim to be. Rather, they’re marketed as complementary AI chatbot tools developed with the help of human relationship therapists.

“AI also gives personalized feedback. AI can also give tips to promote meaningful connections. It also has guided conversation prompts,” explains Dr. Baquiano.  

Its accessibility, however, comes with an inevitable downside. “AI is looking at the prompts that are being given to it when it’s being used. So AI still doesn’t know your background as a person. AI doesn’t know the flaws of a person or the history of your dynamics or your own romantic history. So it still doesn’t have the capacity to understand complex dynamics,” Dr. Baquiano adds.

Arya tries to account for this limitation: when conversations with its AI intimacy concierge start to become more emotionally charged (if the user says, for example, “why am I feeling so disconnected lately” or “how do I talk to my partner about this”), the user is immediately transferred to the app’s human concierge team.

That team is made up of five on-call certified sex therapists, sex educators, relationship psychologists, and relationship scientists, with oversight from Shan Boodram and Prof. Pepper Schwartz. Offer explains that once members sign up, they’re assigned their own concierge, who reaches out within a day. He clarifies that communication happens only over text or the web app, not video, noting that they’ve “found that’s the preferred and easiest way members like to interact.”

Israa Nasir, a licensed therapist and author of Toxic Productivity, points to some of the easy ways AI can make us more productive in our relationships. “AI apps like this can help people learn emotional language and increase emotional literacy as well as [help users identify] topics to bring to your IRL therapist to discuss.”

But while she sees the upside, she shares some concerns about the objectivity of AI. “AI will only provide information based on what you feed it, and there is a risk that the model you’re talking to may not be as attuned to differences in the human experience, because of inherent biases in the way the AI model is built,” she adds.

When asked if AI can replace therapy, Shadeen Francis, a licensed marriage and family therapist, says it’s unlikely—plus, she worries about the harm certain AI apps can do by encouraging people to isolate and disconnect even more. 

“Therapists also get to know their clients, track patterns of behavior, and support them in reaching their goals,” Francis says. “However, part of their role is also to compassionately challenge their clients and help them grow.” Francis adds that many AI models learn from every user interaction and use that data to generate responses designed to please the user, which she considers a big ethical concern.

Users are free to share whatever they want with AI, a glaring issue that Dr. Baquiano also stresses. “Because it’s just repeating what we’re saying, that may, in very subtle means, validate dysfunctional patterns. Especially depending on the bias that we’re giving to AI. So AI, because it’s repeating what we’re saying, that can reinforce distorted narratives,” she explains.

Even though many apps aren’t directly marketed as a therapy replacement (though there are exceptions, like Abby, which is actually meant to be an AI therapist), people are using them to replace visits with a human therapist. There are several videos promoting Entries AI, an AI journal to help people process their feelings, and suggesting followers can use the tool “if they can’t afford therapy.” 

A subreddit called r/therapyGPT has 11,000 members, and there are plenty of videos teaching people how to use LLMs for therapy, too.

Even an innocent text can be dissected by these tools, especially once you’ve gone far enough down the rabbit hole. ChatGPT, for one, has become a constant companion for many users, judging by their own accounts. They simply send a screenshot of the text, along with the context and backstory of how it came to be, and leave it to ChatGPT to explain what the message means, no matter how absurd the reading. If they’re not satisfied with its answer, they’ll casually order it to be as “blunt and honest as possible,” in an attempt to safeguard their own feelings.

There is also growing concern that AI can erode a person’s emotional intelligence. Nasir refers to this as “Skill Erosion,” and it certainly applies to the skills needed to maintain a healthy romantic relationship.

“As humans, we need to be able to organize our thoughts ourselves and be able to articulate our needs ourselves—we cannot outsource thinking to AI when it comes to navigating relationships or conflict,” she says. “Always having AI to think for us prevents us from accessing self soothing and building the skill to problem solve, process difficult emotions, or make sense of our experiences.”

Letting AI solve these problems “could lead to avoidant behaviors. By doing that, the person may be actually avoiding facing their true feelings,” Dr. Baquiano stresses. In other words, AI gives users a way to sidestep confronting their own emotions. “So that means that deep down we’re thinking that I didn’t do this. It was AI who said this. So it’s like we’re sidestepping the discomfort of actually doing the breakup,” she adds.

When asked how Arya’s AI models mitigate the potential for skill erosion and other biases, the founder, Offer, notes that modeling AI is a work in progress, even for their app. “Emotional intelligence isn’t something we want to automate so we’re continually testing and refining our models with feedback from diverse members to make sure we’re not just building for efficiency, but for empathy,” he says.

All of that said, using an app to generate a conversation starter once in a while probably doesn’t spell the end of humanity. Ishani, who also sees a human therapist once a week to process some of her own feelings about the relationship, has found that some of the prompts on the app help her and her partner face difficult conversations, especially while living apart from one another. “When we meet in person, we analyze our respective answers to some of the topics that one of us may not be comfortable bringing up over the phone,” she says.

If you’re in the AI-curious set, here are a few tips on how to use these types of relationship apps responsibly.

Ask yourself why you want to use AI

This technology is not built to understand or decipher human emotions; it lacks the emotional intelligence and empathy a real human therapist is supposed to have. So when using these apps, be honest with yourself about why you’re using them. Nasir encourages couples to treat them as a tool for generating curiosity and connection, not as a way to validate negative feelings about your partner.

Do your research

Opt for apps that share information about how their AI models are trained. Although the EU AI Act offers a framework for what constitutes responsible AI use in healthcare, its guidelines still need to be adapted to the Philippine healthcare system, which doesn’t have its own policies on AI use in health. The US Department of Health and Human Services, on the other hand, has guidelines on what constitutes a well-trained health-related AI: The ideal model is trained on peer-reviewed research with constant oversight from a team of humans. An unsupervised generative model that relies exclusively on content from you, the user, can create an echo chamber in its output. Most companies share their research and explain how they develop their proprietary AI models; if they don’t, that’s definitely a red flag.

Go for hybrid apps

Especially when it comes to relationships and sex, think about using an app that shares the advice of a real human therapist with an actual degree and license. And remember: Anyone can claim to be anything online. Even if a website says someone is a licensed family therapist, you should confirm their qualifications by looking up their name or PRC license number.

Beware of sharing too much

As Shadeen Francis notes, “each platform has independent practices for data storage and privacy, so security of data becomes more complicated.” And while many apps claim they’re doing the best they can to protect your data, leaks can happen, and any app that pushes you to share personal details about your life should give you pause, so be wary of oversharing. At the moment, the Philippine Congress has proposed House Bill 3480, which seeks to formalize responsible and ethical use of AI in the Philippines. The National Privacy Commission, for its part, is looking to apply the Data Privacy Act to AI systems. None of these proposals, however, targets the use of AI in healthcare, including mental health. The Philippines still has a long way to go in properly and ethically integrating AI into its healthcare systems.

With AI making breakthroughs in almost every field, including health, there is no question that it is here to stay. But, as Dr. Baquiano stresses, users must still take responsibility for its effects. “It is important that we know how we can make sure that AI tools in romance are guided by ethical considerations, prioritizing well-being and privacy,” says Dr. Baquiano. “We see these tools as helpful aids for positive freedom, not as a new source of dependence. AI influences our human emotional communication. It’s important that we be cautious of its potential long-term implications,” she adds.

Apps can be a great way to plan a date if you’re feeling decision burnout, but it’s in your best interest to have difficult conversations with your partner yourself, without the help of a teleprompter. Ultimately, intimacy grows out of shared discomfort and from being vulnerable with the people we love. When it comes to your relationships, decide what you want to do yourself and what you want to outsource to AI.

“If we turn to AI every time we have a problem in our relationship, maybe we won’t listen to our own intuition. We might not anymore listen to our sense of relational self. If love is reduced to a commodity because we use AI as a romantic commodity, maybe the source of meaning of love will be lost,” Dr. Baquiano says.

The next time you’re facing a tough relationship moment, instead of turning to an AI bot, consider calling a friend you trust to give you some advice. As cheesy as this sounds, nothing beats the human touch.

Originally published by Allure US
