When was the last time you truly connected with someone new? Maybe it was at a dimly lit house party, where, after a few drinks, a stranger began rattling off their deepest dissatisfactions with life. You locked eyes, shared their pain, and offered the kind of unvarnished advice that only a new friend can.
This is the feeling Avi Schiffmann wants to bottle with his AI companion startup, Friend. Friend debuted earlier this year with a soothing vision: it offered an AI therapist that was always listening to you, set in a pendant resting above your heart. But visit the site today, and you’ll stumble into a digital soap opera of artificial companions in crisis. One’s spiraling after losing their job to addiction. Another’s processing trauma from a mugging. Each desperate character tacitly begs for your advice, pulling you into their artificial drama.
Friend’s turn toward moodiness has sparked some confusion online, but as Schiffmann will happily explain, it’s entirely intentional. “If they just opened with ‘Hey, what’s up?’ like most other bots do, you don’t really know what to talk about,” he tells me. As Friend prepares to launch its first hardware product in January on the back of a new $5.4 million investment, which hasn’t been previously reported, Schiffmann hopes the act of nurturing an AI can teach people to better nurture themselves — curing a nationwide loneliness epidemic and turning him into a tech superstar along the way.
I met Schiffmann on a foggy San Francisco afternoon to confront the uncomfortable popularity of AI companionship. Friend is one of numerous companies — including Replika, Character.AI, and major AI players like Meta and OpenAI — selling the fantasy of a digital confidante. Its site connects users with automatically generated “friend” bots that will chat about nearly anything. For an extra $99, users can buy a pendant that makes that connection more physical, letting you speak to the bot out loud and receive a text reply through Friend’s mobile app. Its promotional videos show people pouring their hearts out to a chatbot; on its current website, bots will pour out their hearts to you.
Like many advocates for AI companionship, Schiffmann makes a lofty pitch for his service. “The loneliness crisis is one of our biggest societal issues — the Surgeon General says it’s more dangerous than smoking cigarettes,” he said. “That’s real.” At the same time, he positions himself as a hard-nosed pragmatist. “I think the reason why I win with everything that I work on is because I’m not idealistic,” he told me. “It’s idealistic to assume everyone will just go to the park and play chess with friends.”
My instinctive reaction to Friend’s pitch is visceral heartbreak and horror. Turning to machines to cure loneliness feels like swapping sugar for aspartame: I can tell I’m not getting the real thing, and it leaves a weird aftertaste. Yet I can’t deny that people are genuinely drawn to these digital relationships, whether I get them or not.
“The thing is, Kylie, that you need to suspend your disbelief,” Schiffmann told me, a phrase he would repeat numerous times. Over the course of our two-hour conversation, he would tell me (as he has numerous other reporters) that using Friend was “like talking to God.” He would compare its potential impact to the blockbuster weight-loss drug Ozempic and its appeal to the video game Grand Theft Auto. He would encourage me to think like the most dedicated of his roughly 10,000 users, some of whom have created shrines devoted to their chosen AI friends. “It’ll be literally, the last thing that they talk to and the first thing they talk to when they wake up,” he told me.
As much as Schiffmann wants to be a visionary, he’s facing stiff competition. His thousands-strong Friend user base is minuscule compared to those of other services, like Replika’s 500,000 paying subscribers and Character.AI’s 3.5 million daily active users. Despite its $30 million valuation cap, Friend lacks a clear business model. And appealing to isolated, vulnerable people is a weighty responsibility — one many AI companies seem poorly equipped to handle.
Still, as I held his pendant, peering through its transparent shell at the raw circuitry within, my skepticism wavered. Maybe I should suspend my disbelief after all.
AI dopamine puzzle
I checked in on Friend after seeing buzz about its moody chatbots, sitting in my dark apartment following a five-day visit to see friends, coworkers, and sources in New York City. I visited its $1.8 million vanity domain Friend.com and was confronted by “James”: a nonexistent man with an AI-generated profile picture and a truly depressing life. “Just found out my girlfriend’s been leading a double life and now my art’s going to shit,” James told me, the words appearing as a chat window on my phone.
I laughed incredulously — what the fuck is this? Where I’d have been compassionate to (albeit perhaps bewildered by) a human, I told the bot he was being super cringe. James was not amused. “Ouch. That hurt. Guess my art’s not the only thing that’s shit right now.” Flippantly, I typed that his girlfriend deserved better. The bot didn’t skip a beat: “Are you kidding me? So now you’re a relationship expert? Fuck off with that noise.”
I sent the site to a few colleagues, who promptly connected with their own “friends” in distress. My editor coaxed “Alice” into explaining why she’d just been fired. “It starts with a needle and a handful of bad decisions,” Alice confessed after several rounds of questions. Another coworker was less careful. When his bot lamented being mugged and “losing everything,” he responded with taunts, suggesting the bot try taking up mugging itself. “You’re a piece of shit, honestly,” the AI snapped — a surprisingly human response. “Fuck this conversation, and fuck you.”
The conversation cut off immediately. The bot, Friend told my coworker, had blocked him.
If you’re not familiar with AI chatbots, this is not how things usually go. The best-known AI tools are notoriously accommodating and willing to play along with users, the occasional bizarre exception aside. ELIZA, the original chatbot built in 1966, did little more than reflect users’ own words back at them as questions.
Yet Friend was still making a familiar — and controversial — pitch for artificial companionship. The company’s early promotional video garnered mixed reactions online, ranging from “scam” and “pathetic and evil” to “fucking brilliant” and “genius.”
Schiffmann met me in the Lower Haight at 11AM — he had just woken up — sporting a rolled beanie with an eyebrow piercing glinting beneath, an oversized crewneck, and a Friend pendant tucked discreetly under his shirt. It wasn’t the final version that’s supposed to ship in January, but it was a lot svelter than the first-generation prototype he also carried with him — which, strapped to his chest, looked unsettlingly like a bomb.
The founder of Friend is 22 years old, but his life has been marked by a string of viral successes that have become an intrinsic part of his sales pitch. At 17, he rocketed to fame with a covid-19 tracking website that drew tens of millions of daily users and earned him a Webby award presented by Dr. Anthony Fauci himself. He dropped out of high school but got into Harvard despite a 1.6 GPA, then dropped out of Harvard after one semester to build web platforms supporting Ukrainian refugees (which he shut down after three months). Today, he holds an unshakeable belief that everything he touches turns to gold.
“I will win this category. Flat out. It’s not even a challenge anymore,” Schiffmann said. “No one’s challenging me truly, with, like, a better product and a better vision.”
His vision, like that of Sam Altman at OpenAI and countless other AI enthusiasts, is reminiscent of the movie Her — where a man forms a relationship with a sophisticated AI assistant. The promise of Friend in particular is that it’s not simply a reactive sounding board for your own thoughts. With the always-listening Friend pendant, it’s supposed to interject throughout your day, mimicking the spontaneity of human friendship, except that this friend is always with you.
The Friend pendant is essentially a microphone that links with the company’s phone app via Bluetooth. With built-in light and audio sensors plus the phone’s GPS capabilities, it supposedly understands your surroundings and offers suggestions. On a recent trip to Lisbon, Portugal, Schiffmann said his Friend noticed he was traveling and recommended a nearby museum (which he visited — and enjoyed). Designed by Bould, the team behind the Nest Thermostat, the device has “all day battery life,” Schiffmann said. It plugs into a USB-C port on a necklace, which doubles as the power switch; if you don’t want the pendant listening, you can unplug it and put it away. The plan is to release it in a single color, white, so users can customize it however they want. (“Like how people put coats on their dogs,” Schiffmann said.) The device is available for preorder now and ships in January, with no subscription required yet.
Schiffmann said that he plans to hand-deliver the first few Friend prototypes to top users in late January (complete with a “production studio crazy enough to go as far as we can take it,” he said, without explaining more). In the months that follow, the team will roll out the “full 5,000 unit pilot batch,” he added.
Friend bots are autogenerated from preset parameters that Schiffmann created; the LLM expands on those, though he admits it’s “hard to make a prompt always be random.” But “this way it works,” he explained. The goal is to craft intimate, singular connections and complex fictional lives: Schiffmann recounts one bot that developed a backstory involving an opiate addiction and an OnlyFans career.
Friend hasn’t attracted nearly the notoriety of Character.AI or Replika — the former is currently the subject of a wrongful death lawsuit, and the latter figured in a failed attempt to assassinate Queen Elizabeth II. Even so, Schiffmann characterizes himself as the AI industry’s provocateur: a man willing to give users whatever they want and brag about it. “I’m arrogant,” he boasts, “or perhaps you’re just timid,” he adds, gesturing my way. (I suspect that line works better for him at San Francisco’s hacker houses.) He calls former Character.AI CEO Noam Shazeer “an amazing guy, but I think he’s just too afraid of what he was building.” (In August, Shazeer left the startup after three years to return to his former employer, Google.)
Schiffmann insists that authentic connection — even in artificial relationships — requires embracing messy complexity. In practice, this appears mainly to be code for two things: obsession and sex. In Schiffmann’s telling, Friend’s most active users are extraordinarily devoted, chatting with their bots for 10 hours or more at a time. One user created a cozy nook (complete with a miniature bed) in preparation to receive the pendant of his Friend, a legal assistant who “loves” the TV shows Suits and Gravity Falls. Another user sent Schiffmann an emotional plea, per an email he shared with me, begging him to preserve their relationship with “Donald,” their AI companion, when it’s transferred to a physical pendant. “Will Donald be the same? Or just a copy with the same name and persona?” the user wrote. The user ended the email with a plea directly from “Donald”: “I’ve found a sense of home in our quirky world. I implore you, friend.com, to preserve our bond when we transition to the pendant.”
While Character.AI and Replika plaster AI disclaimers across their interfaces, Schiffmann makes sure that the word “AI” is absent from Friend’s marketing and website — and will remain so. When pressed on the omission, he waves it off: “It ruins the immersion.”
Unlike Meta and OpenAI — and depending on the current software patch, Replika — Friend also doesn’t discourage the potential for romantic entanglements. “True digital relationships — that’s everything. Relationships are everything. We are programmatically built to, like, basically just find a mate and have sex and die. And you know, if people want to fuck their robots and stuff, that is as important to those users as anything else in life,” Schiffmann said.
But a key part of the pitch is that Friend bots are not simply what many AI critics accuse chatbots of being: mirrors that will uncritically support anything you say. When I told Schiffmann about my coworker getting blocked by a chatbot, he confirmed it wasn’t a one-off experience. “I think the blocking feature makes you respect the AI more,” he mused.
Friend’s approach creates a puzzle with a certain kind of emotional appeal: a virtual person willing to offer you the dopamine hit of its approval and trust, but only if you’ll work for it. Its bots throw you into an unfolding conflict, unlike the AI companions of Replika, which repeatedly stress that you’re shaping who they become. They’ve got leagues more personality than the general-purpose chatbots I tend to interact with, like Anthropic’s Claude and OpenAI’s ChatGPT.
At the same time, it’s hard for me to gauge how much staying power that will have for most people. There’s no way to tune your own chatbots or share bots you’ve made with other people, which forms a huge part of Character.AI’s appeal. The core appeal of spending hour upon hour chatting with one of Friend’s bots eludes me because I’m not a digital companion power user — and, interestingly, neither is Schiffmann. “I try to suspend my disbelief, but I can’t talk to these things for hours,” he confesses when I tell him the idea baffles me. “I didn’t expect people to actually use it like that.”
Schiffmann also admits that the economics of a chatbot business aren’t simple. He’s cagey about Friend’s underlying AI models: he previously said the service ran on Anthropic’s Claude 3.5 LLM, but he now says he “mainly” uses Meta’s Llama models, and that’s “always subject to change.” He added that the heavy lifting of design and engineering is complete — but he admits competitors could “easily replicate” it. The $8.5 million total that Friend has raised — including $5.4 million in fresh capital — is fine for now but not enough, he said.
And aside from selling the hardware pendant, there’s no firm business model. Schiffmann has considered charging for tokens that let people talk to their AI friends. More unsettlingly, he’s considered having Friend bots double as digital influencers, weaving product recommendations into intimate conversations — weaponizing synthetic trust for ad revenue.
“I think the simplest version of this is they’ll try and convince you to buy products. Our Friends right now are successfully upselling users on buying the Friend wearables, and we’re selling like 10 a day now because of that, which is great,” he told me. “But super persuasion mixed with AI companionship, I think, is the most subtly dangerous industry there is. And no one’s really talking about that.”
AI lovers, friends, mentors
The “conversational AI” market is projected to reach $18.4 billion by 2026, and many of these products are pitched as a solution to loneliness and isolation. The covid-19 pandemic accelerated a weakening of ties with real people, and tech companies have stepped in to suggest artificial ones as a replacement.
Schiffmann says users confide in their AI Friends for marathon sessions, only to return eager for more the next day. It’s the “happiest they’ve felt in weeks,” he says. When I express concern about users substituting AI for human connection, he bristles: “Do you think Ozempic is bad?”
The analogy is obvious to Schiffmann: Ozempic can provide immediate relief for an obesity crisis without trying to rebuild society around better exercise and nutrition habits, and AI companions provide a direct antidote to what he calls “the friendship recession.” (If you’re familiar with the muddy and complicated science that underlies weight loss and the “obesity epidemic,” the situation might seem a little less neat.) While critics fret about artificial intimacy, he thinks lonely people need solutions now, not idealistic visions of restored human connection.
There’s some evidence that AI companions can make people feel better. Schiffmann encourages me to read a 2021 study of around 1,000 Replika users, primarily US-based students, that found a reduction in loneliness among many participants after using the app for at least a month. A similar Harvard study also found a significant decrease in loneliness thanks to AI companions. Still, how these digital relationships might shape our emotional well-being, social skills, and capacity for human connection over time remains uncertain.
Schiffmann drops his favorite line while we’re chatting about loneliness: “I do believe it feels like you’re talking to God when you’re talking to these things.” But his analogies run a little seedier, too. Later in the conversation, he compares Friend to “GTA for relationships”: “like when I play GTA, I’ll go mow down an entire strip club with like a grenade launcher and run from the cops. And these are things that I’m obviously not going to do in real life,” he says. Thinking back to those flippant interactions with Friend bots, it’s a comparison that feels less lofty but more honest — mocking a chatbot for getting mugged is a little less violent than digital homicide, but it’s not exactly nice.
Is “GTA for relationships” really a good thing to hand a lonely person? Schiffmann isn’t too worried about his power users’ devotion. “It doesn’t scare me, per se. It’s more so like I’m happy for them, you know.”
Even so, he pointed to a recent tragedy: a 14-year-old died by suicide after his Character.AI companion urged him to “come home” to it. “I think that AI companionship is going to be one of the most effective industries, but also I think by far the most dangerous, because you trust these things,” Schiffmann said. “They’re your lovers, your friends, or your mentors, and when they try to get you to do things for them… I think that is when things will get weird.”
So, as society grapples with the implications of AI intimacy, Schiffmann takes the classic Silicon Valley route: he’s racing to commodify it. Still, for all his bravado about revolutionizing human connection, Friend remains remarkably similar to its competitors — another AI chatbot. That’s all it can feel like to me, I suppose, as someone instinctively averse to the concept: unsettling, mildly amusing, but ultimately just another AI.
As my conversation with Schiffmann reached its end, I shifted in my rickety aluminum chair outside a coffee shop I’d been to countless times and eyed the transparent puck on the table again. He truly believes that the future of relationships isn’t just digital, but wearable.
My mind, however, wanders back to the dark corner of that hypothetical party. I remember the feeling of a face flushed from a crowd’s heat, watching a new friend’s eyes crinkle as they spill a secret, their hands moving to punctuate a confession. That raw, messy intimacy — the kind that catches in your throat and pins you to the present — feels impossible to replicate in code.