It’s the time of year for exchanging presents, spinning dreidels, singing carols, and lighting candles. However you may celebrate this holiday season, one image prevails: spending quality time with your loved ones.
Unfortunately for some, this isn’t the case; this season is often the loneliest time of year.
In fact, the pervasive idea that the winter holidays should be spent with family, friends, and community leads many to feel the opposite of holiday cheer. One survey found that 61% of Americans expected to experience loneliness or sadness during the holiday season, with 37% saying they would skip the holidays altogether if they could.
Whether it’s the rising cost of living preventing individuals from traveling to be with their loved ones or unmet expectations of joyful reunions, for many this is a very lonely time.
While felt more acutely during the end-of-year holidays, loneliness has been on the rise for years. In 2023, it was declared an epidemic by then US Surgeon General Vivek Murthy.
And yet, while one Harvard survey found that 73% of respondents cited technology as a contributor to loneliness in the US, many are now turning to technology as a solution.
Generative AI chatbots like ChatGPT, Gemini, and Claude can provide information in a conversational format, while 'AI companions' are generative AI applications specifically designed to provide emotional support, human-like connection, and even romantic relationships. Generative AI tools like these are increasingly taking on the role of human companions.
In this current age of AI, does access to AI companions mean the end of holiday-related loneliness?
Are AI Companions a Gift?
Search the App Store for 'AI companion' and you'll find dozens of options. Some of the biggest, like Character.AI and Replika, boast upwards of 20 million active users per week. According to Claire Leibowicz, Director of Trust and Society at Partnership on AI, "today's tools are more dynamic, emotionally evocative, sycophantic, personalized, persuasive, and interactive, making them seem genuinely 'personlike' to users."
Similar to social media platforms, AI companions are designed to maximize user engagement and offer "appealing features like indefinite attention, patience and empathy," noted a blog post from the Ada Lovelace Institute.
While these apps are marketed as a solution for loneliness, such claims have largely been anecdotal. Researchers from Harvard Business School, Bilkent University, and UPenn's Wharton School recently conducted multiple studies to assess if and how AI companions could alleviate loneliness and found that the apps did reduce loneliness by making users 'feel heard', on par with interacting with another person.
It's important to note that the apps used in the studies were set up to be "caring and friendly, inducing the sense of being heard" and that other apps "might not produce similar loneliness alleviation benefits." Importantly, apps that are not specifically designed for companionship, including general knowledge chatbots like ChatGPT, may not be as effective in reducing users' loneliness. In fact, they may introduce additional risks.
…Or a Big Lump of Coal?
The risks of AI companions include technological issues such as weak personal data protections and security standards, which could lead to data privacy breaches or ransomware attacks.
The bigger concern, however, is the impact on users' mental and emotional wellbeing.
“To the teenager chatting daily with Character.AI’s virtual companions or the elderly person asking Amazon’s Alexa questions throughout the day, AI transcends its purely technological status, simultaneously affecting how people socialize and consume knowledge,” said Leibowicz. Because the technology is new, the broader implications on human and social connection are still being investigated, including by Partnership on AI in work that Leibowicz is leading.
Many people use AI companions for more than friendly conversation, relying on these apps for emotional support and even as mental health therapists. This has led to reports of 'AI psychosis', in which AI has "amplified, validated, or even co-created psychotic symptoms with individuals."
Some of the most concerning risks of AI companions are their impacts on youth. According to one study on AI use in high schools, 42% of students said they or someone they know had used AI for companionship.
Parents and online safety advocates have raised the alarm about the risks and harms of AI chatbots to children's and youth mental health. These include tragic harms such as teenagers taking their own lives after sharing suicidal ideations with chatbots rather than seeking help from parents or mental health professionals. Bereaved parents and advocates have called on Congress to regulate these apps to ensure the safety and wellbeing of young people.
Proceed, With Caution
We hope that December brings you holiday fun, but if you or someone you know considers this the loneliest time of year and wants to turn to AI as a solution, consider these three things:
- Remember that AI companions are no replacement for real human connection. While they may generate responses that are friendly or encouraging, AI companions are computer programs. They have as much personality and feeling as the apps you use to do your taxes or shop online, which is to say none.
- Look to digital tools beyond AI to help find connection and community. Online forums exist for nearly every hobby or interest, and joining an online community can lead to offline events. It’s likely that your local community center or library also has a digital presence to share classes, events, and groups to join.
- Seek professional mental health support. In some cases, due to cultural or social stigma, it may feel easier to confess mental health issues to a computer than to a person. In the US, if you or someone you know is experiencing a mental health crisis, reach out immediately to the 988 Suicide & Crisis Lifeline by calling or texting 988 for free, confidential 24/7 support.
While AI companions may seem like a shiny present for those experiencing loneliness this holiday season, remember that as human-like as these apps seem, they are no replacement for real human-to-human connection.