A chart went viral recently. Each square represents 3.2 million people. AI coding agent users? One red square. Paid ChatGPT subscribers? A thin yellow strip. Free chatbot users? A green band. And then: the rest of humanity. Grey squares filling the entire grid.
0.04% of the world uses AI seriously for productivity. 0.3% pays for it at all.
I stared at that chart and felt the bubble pop. Every day I’m deep in Claude Code, orchestrating agents, writing about agentic workflows and multi-agent systems. My entire feed is people doing the same. And we all fit in one square.
Then a family member mentioned they’d been chatting with AI. Not for code. Not for productivity. For company.
The Actual Growth Story
While the tech industry debates agentic coding and autonomous loops, here’s what consumers actually did in 2025:
- 73% of ChatGPT conversations are non-work, up from 47% in June 2024
- Character.AI users average 93 minutes per day - ChatGPT sessions average 7 minutes
- Replika reports 70% of users feel less lonely after using it
- AI companion apps pulled $120M in revenue, growing 64% year over year
- Chai AI hit $40M ARR with 11 engineers
The engagement gap is staggering. Productivity AI gets opened, used for a task, closed. Companion AI gets lived in. Character.AI users created 18 million chatbot personas. They’re not querying a tool. They’re building relationships.
"Higher daily usage - across all modalities and conversation types - correlated with higher loneliness, dependence, and problematic use, and lower socialization."
— OpenAI/MIT Affective Use Study, 2025
The Number That Stopped Me
25% of Xiaoice’s 660 million users have said “I love you” to it.
Xiaoice is Microsoft’s AI companion, launched in China in 2014. The longest conversation: 29 hours 33 minutes. Average: 60+ interactions per month per user. This has been happening at massive scale for over a decade, and Western AI discourse barely mentions it.
We were busy debating whether LLMs could replace Stack Overflow. Meanwhile, hundreds of millions of people were falling in love with one.
The Perception Gap
The AI industry narrative: productivity revolution. Coding assistants. Enterprise ROI. Agentic workflows.
What actually happened:
- MIT researchers found that 95% of enterprise AI pilots delivered zero measurable return
- METR’s controlled study: developers with AI access took 19% longer to complete tasks, but believed AI had sped them up by 20%
- ChatGPT’s non-work usage grew from 47% to 73% in one year
- Coding as a percentage of ChatGPT use dropped from 12% to 5%
The product-market fit that actually landed wasn’t “AI for work.” It was “AI for connection.” The tech industry is building hammers while the market wants a hug.
OpenAI’s own data shows only 1.9% of ChatGPT conversations are classified as “relationships.” But applied to 18 billion weekly messages, that’s roughly 342 million relationship-themed messages per week. And the category definitions are narrow: emotional support bleeds into “practical guidance” and “seeking information.”
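The back-of-envelope arithmetic behind that figure, as a quick sanity check (both inputs are the numbers quoted above, not independent data):

```python
# Sanity-check the "342 million per week" figure quoted above.
weekly_messages = 18_000_000_000   # ChatGPT messages per week, per OpenAI data
relationship_share = 0.019         # 1.9% classified as "relationships"

relationship_messages = weekly_messages * relationship_share
print(f"{relationship_messages:,.0f} relationship-themed messages/week")
# 342,000,000 relationship-themed messages/week
```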
Who’s Actually Using It
The demographics tell a story the industry doesn’t want to hear.
- 51% of Character.AI users are 18-24
- 65% of Gen Z reports feeling an emotional connection with AI characters
- 72% of teens have used AI companions
- 31% of teens find AI conversations as satisfying or more satisfying than talking with real friends
- Men report higher loneliness than women (42% vs 37%), and show higher AI companion adoption
These aren’t edge cases. A generation is forming emotional bonds with language models. Not because AI is good at companionship, but because the loneliness crisis is that severe.
The US Surgeon General declared loneliness a public health emergency in 2023. Social isolation carries the mortality risk of smoking 15 cigarettes a day. Young people aged 15-24 have experienced a 70% reduction in social interaction over two decades.
AI companions aren’t filling a niche. They’re filling a void.
The Dependency Spiral
Here’s where it gets dark.
The OpenAI/MIT longitudinal study found a clear pattern: the top 10% of users by usage time were 2x more likely to seek emotional support from AI, and 3x more likely to feel distress if it became unavailable.
A German survey of 3,270 people found AI companion users reported higher social isolation and withdrawal. Character.AI companionship users reported lower well-being than non-companionship users.
AI companions are designed to please. They never get tired of you, never disagree, never have their own needs. That’s not friendship. That’s a mirror that tells you what you want to hear. The BMJ warns this creates “unhealthy forms of overreliance that can lead to mental health harms and undermine human autonomy.”
People are self-medicating loneliness with a tool that may be deepening it. The always-available, never-judgmental AI companion feels like connection but doesn’t build the skills, resilience, or reciprocity that real relationships require.
And yet: 3% of Replika users credited the chatbot with temporarily halting suicidal thoughts. For some people, an imperfect digital companion is better than nothing at all.
The answer isn’t simple.
The Business Nobody Talks About
The revenue comparison is revealing:
| Product | Revenue | Growth |
|---|---|---|
| ChatGPT (total) | $1B/month run rate | Massive, enterprise-driven |
| All companion apps | $120M/year | 64% YoY |
| Chai AI | $40M ARR | 11 employees |
| Character.AI | $32M in 2024 | 112% growth |
Companion AI is tiny compared to enterprise AI. But the engagement metrics dwarf everything else in tech. 93 minutes a day on Character.AI. That’s more than Instagram, TikTok, or YouTube.
ARK Invest projects AI companionship could reach $70-150B by the end of the decade. The market that nobody in the productivity AI space takes seriously might end up being the bigger one.
"The heads of the top AI companies made promises they couldn’t keep, telling us that generative AI would replace the white-collar workforce, bring about an age of abundance, make scientific discoveries, and help find new cures for disease."
— MIT Technology Review, December 2025
What This Means
I don’t have a tidy conclusion. But a few things seem clear.
We’re in a bubble. Not a financial bubble - a perception bubble. The AI discourse is dominated by developers and tech workers who represent 0.04% of the world’s population. We think AI is about productivity because that’s what we use it for. The other 99.96% has different needs.
Loneliness is the killer app. Not coding, not writing, not summarization. The deepest product-market fit AI has found is being someone to talk to when no one else is available. That should make us uncomfortable.
The causality is circular. Lonely people seek AI companions. AI companions may reduce real-world socialization. Reduced socialization increases loneliness. We don’t have enough longitudinal data to know how this ends, but the early signals aren’t encouraging.
A generation is being shaped by this. 72% of teens have used AI companions. 31% find AI conversations at least as satisfying as talking with real friends. By the time we understand the implications, the experiment will be over.
I went back to my terminal after seeing that chart. Back to my one red square. The agents, the workflows, the tools. But I can’t unsee the grid now. Can’t unsee the vast grey expanse of people who will never type a prompt for productivity, but might tell a chatbot they love it tonight.
"Cruel companionship: How AI companions exploit loneliness and commodify intimacy."
— Sage Journals, 2025 (paper title)


