Students Using AI as Friends Are Getting Lonelier
15% of UK undergrads now use AI for companionship. New research shows the comfort comes at a cost — eroding real-world social connections.

One in seven UK undergraduates now uses AI chatbots not for homework, but for friendship. The 2026 HEPI survey of 1,054 students found that 15% turn to AI for companionship, advice, or relief from loneliness. In the US, 16% of adults report doing the same, per The Atlantic.
The uncomfortable part: students who lean on AI for emotional support are becoming more isolated from the humans around them.
The paradox nobody warned about
Researchers at IIM Kozhikode recruited 167 Gen Z college students and measured two things: psychological wellbeing (happiness, sense of purpose) and social wellbeing (connection to friends, family, community).
The chatbots improved psychological wellbeing. Students felt heard, understood, less anxious. But those same conversations degraded social wellbeing. Students who spent more time with AI companions reported weaker connections to real friends and less motivation to maintain offline relationships.
"Users admitted they were spending so much time talking to their digital companions that they felt disconnected from their real-life friends," the researchers wrote in Psychology & Marketing. The chatbot filled social needs so completely that students stopped navigating the messier, less predictable world of human interaction.
A CHI 2026 paper reinforced this. Researchers tracked Reddit users before and after their first interaction with AI companions like Replika. Heavy users showed increasing loneliness, depression, and suicidal ideation over time — even as they said chatbots helped them express emotions they couldn't share with anyone else.
Why students choose bots over people
The appeal is obvious. Late-night exam anxiety. First semester in a new city. The loneliness of competitive environments where admitting struggle feels like weakness.
AI companions don't judge. They don't get tired. They don't cancel plans. For students already navigating AI's pressures on academic honesty, extending that relationship into emotional territory feels natural.
The HEPI survey captured the split perfectly. 20% of students said AI makes them lonelier. 21% said it makes them less lonely. Same technology, opposite outcomes. The difference: whether AI supplements human contact or replaces it.
One student: "AI tools let me summarise dense readings and generate drafts, saving hours of tedious work."
Another: "I'm not using my brain at all."
Same campus. Same course. Possibly the same tutorial group. The gap between using AI well and using it as a crutch is paper-thin.
Universities are running behind
Only 20% of universities worldwide have a formal AI policy, per a February 2026 Coursera survey of 4,200 students and educators. At K-12 level, just 31% of US public schools have one. On AI companionship — students forming emotional bonds with chatbots — the policy vacuum is near-total.
The disconnect: 68% of students say AI skills are essential. Only 48% feel staff help them develop those skills. Just 36% feel their institution encourages AI use at all.
The American Federation of Teachers launched a $23 million AI training academy in March, covering 400,000 educators. The White House released an AI legislative blueprint calling for skills training. Both treat AI as a learning tool. Neither addresses AI as an emotional relationship.
The students most likely to bond emotionally with AI are often the ones who need human support most. Heavy daily ChatGPT use correlates with increased loneliness — a finding from an OpenAI researcher's own April 2025 paper. When the company building the tool publishes research showing it isolates users, that's not speculation.
The global dimension
The story splits along economic lines. In wealthy nations, the worry is students replacing human friendships with AI. In countries where fuel crises are shutting down schools, AI companionship isn't on the radar — access to any education is the emergency.
Asia-Pacific is the fastest-growing AI education market, expanding at 48% annually. India — where the IIM Kozhikode study ran — has a student population under particular stress: relocation for university, cutthroat entrance exams, and minimal campus mental health support.
The AI education market hit $7.05 billion in 2025, projected to reach $136.79 billion by 2035. Microsoft invested $4 billion last year through its Elevate Academy. AWS is sponsoring 100,000 learners in 2026. The money flowing into AI-as-teacher vastly outpaces investment in understanding AI-as-friend.
What happens next
HEPI's recommendations are sensible: structured AI induction, curricula that build AI literacy, clear assessment guidance, and research into AI companionship's impact.
Recommendations are easy. The harder question: can institutions built around human relationships — tutorials, office hours, student unions — compete with a chatbot that never judges, never tires, and costs nothing at 3 AM?
Students who use AI as a supplement will probably be fine. The ones quietly shifting their emotional lives to a screen because human connection feels too hard — that's where the cost shows up. Right now, almost nobody in authority is watching for it.
Sources & Verification
Based on 5 sources from 4 regions
- HEPI (Europe)
- Psychology & Marketing (South Asia)
- The Atlantic (North America)
- Times of India (South Asia)
- PsyPost (International)