Catholic experts say new AI ‘Friend’ device undermines real relationships


A commuter waits at the Westchester/Veterans Metro K Line station on Thursday, Oct. 2, 2025, in Los Angeles. / Credit: Carlin Stiehl/Los Angeles Times via Getty Images

Washington, D.C. Newsroom, Oct 21, 2025 / 07:00 am (CNA).

A controversial ad campaign posted in the New York City subway system has sparked criticism and vandalism over the past few weeks. The print ads are selling an AI companion necklace called “Friend” that promises to be “someone who listens, responds, and supports.”

The device first launched in 2024, retailing at $129. It is designed to listen to conversations, process the information, and send responses to the user’s phone via a connected app. While users can tap the disc’s button to prompt an immediate response, the product will also send unprompted texts. The device’s microphone has no off switch, so it is constantly listening and sending messages based on the conversations it picks up.

CNA asked Friend.com about the success of its subway ad campaign and how many people are currently using the devices but did not receive a response. Sister Nancy Usselmann, FSP, director of the Daughters of St. Paul’s Pauline Center for Media Studies, who also studies AI, told CNA that “people are turning to AI for companionship because they find human relationships too complicated.”

But “without that complicatedness, we cannot grow to become the best that we can be. We remain stagnant or selfish, which is a miserable existence,” she said.

Creating ‘Friend’ amid loneliness epidemic 

Avi Schiffmann, the 22-year-old who started Friend.com, was a Harvard student before leaving school to focus on a number of projects. At 18, he created a website that tracked early COVID-19 data from Chinese health department sources. In 2022, he built another website that matched Ukrainian refugees with hosts around the world to help them find places to stay. He then founded Friend and now serves as the company’s CEO. 

Schiffmann and his company first turned heads when an eerie video announcing the new gadget was released in July 2024. The advertisement featured four different individuals interacting with their “friends.” One woman takes a hike with her pendant, while another watches a movie with hers. A man gets a text from his “friend” while playing video games with his human friends. He first appears sad and lonely around his friends until his AI “friend” texts him, which seems to put him at ease.

The marketing video ends with a young man and woman spending time together as the woman discusses how she has only ever brought “her” to where they are hanging out, referencing her AI gadget. 

“It’s so strange because it’s awkward to have an AI in between a human friendship,” Usselmann said about the video ad. 

Hundreds took to the comments section of the YouTube video to respond — mostly negatively — to both the Friend.com ads and the technology. Commenters called out the company for capitalizing on loneliness and depression. One user even called the video “the most dystopian advertisement” he had ever seen, and others wrote the video felt like a “horror film.”

“While its creators might have good intentions to bring more people the joys of companionship, they are misguided in trying to achieve this through a digital simulacrum,” Father Michael Baggot, LC, professor of bioethics at the Pontifical Athenaeum Regina Apostolorum in Rome, told CNA.

The device “suffers from a misnomer, since authentic friendship involves an interpersonal relationship of mutual support,” said Baggot, who studies AI chatbots and works on the development of the Catholic AI platform Magisterium AI.

“The product risks both worsening the loneliness epidemic by isolating users from others and undermining genuine solitude by intruding on quiet moments with constant notifications and surveillance. Friend commodifies connection and may exploit human emotional vulnerabilities for profit,” he said, adding: “It might encourage users to avoid the challenging task of building real relationships with people and encourage them to settle for the easily controllable substitute.”

Usselmann agreed. “Only by reaching out in genuine compassion and care can another person who feels lonely realize that they matter to someone else,” she said. “We need to get to know our neighbors and not remain so self-centered in our apartments, neighborhoods, communities, or places of work.”

AI device ad campaign causes stir

In a post to social media platform X on Sept. 25, Schiffmann announced the launch of the subway ad campaign. The post has more than 25 million views and nearly 1,000 comments criticizing the pendant and campaign — and some commending them.

Dozens of the ads have since been torn up and written on. People have posted images to social media of the vandalized ads with messages about the surveillance dangers and the general threats of chatbots. One urged the company to “stop profiting off of loneliness,” while another had “AI is not your friend” written on it.

One person added to an ad’s definition of “friend,” writing that a friend is also a “living being.” The same ad carried the message: “Don’t use AI to cure your loneliness. Reach out into the world!”

Usselmann said the particular issue with the campaign and device is “the tech world assuming certain words and giving them different connotations.” 

“A ‘friend’ is someone with whom you have a bond based on mutual affection,” she said. “A machine does not have real affection because it cannot love. It does not have a spiritual soul from which intellect, moral agency, and love stem.”

She continued: “And from a Christian understanding, a friend is someone who exhibits sacrificial love, who supports through the ups and downs of life, and who offers spiritual encouragement and forgiveness. An AI ‘friend’ can do none of those things.”
