Students Using AI as Friends Are Getting Lonelier
15% of UK undergrads now use AI for companionship. New research shows the comfort comes at a cost — eroding real-world social connections.

One in seven UK undergraduates now uses AI chatbots not for homework, but for friendship. The 2026 HEPI Student Generative AI Survey, based on 1,054 full-time students polled in December 2025, found that 15% turn to AI for companionship, advice, or to address loneliness. In the US, the figure is similar — 16% of American adults have used AI for companionship, according to a survey cited by The Atlantic this month.
The uncomfortable part: a growing body of research shows that the students who lean on AI for emotional support are becoming more isolated from the humans around them.
The paradox nobody warned about
A study published in Psychology & Marketing by researchers at the Indian Institute of Management Kozhikode tested what happens when people talk to emotionally intelligent chatbots. They recruited 167 Generation Z college students and measured two things: psychological wellbeing (internal happiness, sense of purpose) and social wellbeing (connection to friends, family, community).
The chatbots improved psychological wellbeing. Students felt heard, understood, less anxious. But the same conversations degraded social wellbeing. Students who spent more time with AI companions reported weaker connections to real-world friends and less motivation to maintain offline relationships.
"Users admitted they were spending so much time talking to their digital companions that they felt disconnected from their real-life friends," the researchers found. The digital companion was fulfilling social needs so completely that students no longer felt the push to navigate the messier, more unpredictable world of human interaction.
A separate paper accepted at the ACM CHI 2026 Conference reinforced this pattern. Researchers tracked Reddit users before and after their first documented interaction with AI companions like Replika, using causal inference techniques borrowed from economics. Heavy users showed increasing signals of loneliness, depression, and suicidal ideation over time — even as they reported that chatbots helped them express emotions they couldn't share with anyone else.
Why students choose bots over people
The appeal is easy to understand. Late-night exam anxiety. First semester in a new city. The particular loneliness of competitive academic environments where admitting struggle feels like weakness.
AI companions don't judge. They don't get tired. They don't cancel plans. They remember your last conversation and ask follow-up questions. For students already relying on AI tools for their coursework, extending that relationship into emotional territory can feel like a natural next step.
The HEPI survey captured the polarisation perfectly. Among students, 20% said AI makes them feel lonelier. Almost the exact same proportion — 21% — said AI makes them feel less lonely. Same technology, opposite experiences. The difference comes down to whether AI supplements human contact or replaces it.
One student told HEPI researchers: "AI tools allowed me to quickly summarise dense readings and generate drafts for assignments, saving hours of tedious work and letting me focus on critical analysis and deeper understanding."
Another said: "I'm not using my brain at all."
Those two statements could come from the same campus, the same course, possibly the same tutorial group. The gap between using AI well and using AI as a crutch is paper-thin, and most institutions aren't helping students tell the difference.
Universities are running behind
Only 20% of universities worldwide have a formal AI policy, according to a February 2026 Coursera survey of 4,200 students and educators across five countries. At the K-12 level, just 31% of US public schools have a written AI policy. When it comes to AI companionship specifically — students forming emotional bonds with chatbots — the policy vacuum is near-total.
The HEPI data spells out the disconnect: 68% of students believe AI skills are essential to thrive today, but only 48% feel their teaching staff are helping them develop those skills. Only 36% feel their institution encourages AI use at all.
Meanwhile, the American Federation of Teachers launched a $23 million National Academy for AI Instruction on March 18, 2026, training 400,000 educators on how to use AI in classrooms. The White House released an AI legislative blueprint on March 20 calling for AI skills training legislation. Both focus on AI as a learning tool. Neither addresses AI as an emotional relationship.
This matters because the students most likely to form emotional bonds with AI are often the ones who need human support most. As CNBC reported earlier this month, heavy daily ChatGPT use correlates with increased loneliness — a finding from an OpenAI product policy researcher's own April 2025 paper. When the company building the tool publishes research showing it can isolate users, that's not a speculative concern.
The global dimension
The story splits along economic lines. In wealthy nations, the worry is that students are replacing human friendships with AI. In countries dealing with fuel crises shutting down schools, AI companionship isn't even on the radar — access to any form of education is the immediate emergency.
Asia-Pacific is the world's fastest-growing AI education market, expanding at 48% compound annual growth, driven by populations where educational achievement carries enormous cultural weight and competitive academic pressure is extreme. India, where the IIM Kozhikode chatbot study was conducted, has a student population under particular stress: relocation for university, cutthroat entrance exams, and limited mental health infrastructure on campus.
The AI in education market hit $7.05 billion in 2025 and is projected to reach $136.79 billion by 2035. Microsoft invested $4 billion last year in AI education through its Elevate Academy. AWS is sponsoring 100,000 learners in 2026 through its AI & ML Scholars program. The money flowing into AI-as-teacher vastly outpaces investment in understanding AI-as-friend.
What happens next
The HEPI report's recommendations are sensible: structured AI induction for all students, curricula that explicitly teach AI knowledge, clear assessment-specific guidance, and research into the impact of students using AI for friendship and loneliness.
But recommendations are easy. The harder question is whether institutions built around human relationships — tutorials, office hours, student unions, group projects — are ready to compete with a chatbot that never judges, never gets tired, and costs nothing extra to talk to at 3 AM.
The students who use AI well — as a supplement to human thinking, a research accelerator, a first draft generator — will probably be fine. The ones who quietly shift their emotional lives to a screen because human connection feels too hard? That's where the real cost emerges.
And right now, almost nobody in a position of institutional authority is watching for it.
Sources & Verification
Based on 5 sources from 4 regions
- HEPI (Europe)
- Psychology & Marketing (South Asia)
- The Atlantic (North America)
- Times of India (South Asia)
- PsyPost (International)