As Artificial Intelligence redefines the world at an exponential rate, its implications for society grow ever vaster. With the prolific rise of AI chatbots, much like the earlier global spread of social media, many are turning to software such as ChatGPT, Replika, Microsoft Copilot, Google Gemini, and Pi, among many others, to support their emotional needs, treating them as AI therapists or online companions. On the surface, technological companionship is an impressive feat in an increasingly lonely world, but we must remain alert to the dangers AI poses to society. The reality is that AI is developing so quickly, and so imperceptibly, that we cannot keep up with its current effects, let alone its future repercussions, and it is leaving irreversible social imprints in its wake. And so we must ask ourselves: is AI eroding the very thing that unites individuals, our natural yearning for shared experience and belonging? Is AI replacing human connection?
Humans are wired to connect, and yet as a species we have never felt more isolated. In 2025, the World Health Organisation (WHO) Commission on Social Connection released a global report revealing that 1 in 6 people worldwide is affected by loneliness, with significant impacts on health and well-being. Loneliness disproportionately affects the young: 17–21% of individuals aged 13–29 reported feeling lonely, with the highest rates among teenagers. Chido Mpemba, co-chair of the WHO Commission on Social Connection and Advisor to the African Union Chairperson, advises:
“Even in a digitally connected world, many young people feel alone. As technology reshapes our lives, we must ensure it strengthens – not weakens – human connection.”
AI chatbots exert a natural pull on those experiencing loneliness and/or low mood, particularly adolescents who, as the report notes, can be more susceptible to becoming emotionally attached to these tools due to a) ongoing cerebral development, b) still-developing social skills, c) the lack of a stable or existing social circle, and d) limited knowledge and understanding of AI’s effects. The report therefore underscores the need for vigilance around the effects of excessive screen time or negative online interactions on the mental health and well-being of young people. If AI chatbots can act as an emotional filler for adolescents struggling with loneliness, then there are grounds to affirm that they are indeed capable of replacing human connection to some degree. To test this hypothesis, however, an important nuance must be addressed: is using AI for emotional connection a result of being lonely, or does using AI make even the more sociable lonelier?
The answer can be found in how we examine social media’s impact on society, as I believe the effects of Artificial Intelligence will follow a similar pattern, though with less frequent surges of greater impact. Social media’s primary goal was to amplify connection, creating and strengthening new and existing relationships through the user-friendly nature of its platforms. On the positive side, 74% of teens say these platforms make them feel more connected to their friends, with a third of young people reporting constant contact with friends online, according to the Health Behaviour in School-aged Children (HBSC) study. From this data one could conclude that social media helps reduce loneliness by increasing connectivity and access to friends, family, and peers. However, the global diffusion of a social-networking technology carries risk: roughly half of teens say these sites have a mostly negative effect on people their age, and more than 1 in 10 adolescents show signs of problematic social media behaviour, struggling to control their use and experiencing negative consequences. Interestingly, fewer than 14% think social media negatively affects them personally, indicating a general lack of self-awareness among adolescents about the harms and dangers of social media. Misguided or uncontrolled use of these platforms can therefore leave even the healthiest of brains with a gradual serotonin deficit, such are the detrimental effects social media can have on the brain.
I believe a similar intrusive neurological cycle applies to how users turn to AI for their emotional needs. For most users, AI software serves as an asset for creative ideation, problem-solving, and even emotional and psychiatric support. Now we see AIs stepping into the role of romantic partner, offering a more stable and trustworthy presence with round-the-clock, consistent, and dependable empathy and emotional availability – qualities that human partners cannot continuously provide (because, well, we are only human). Whilst replacing actual human beings with a stream of code certainly raises ethical and social concerns, turning to the neuroscience behind bonding with an AI helps us understand how emotional dependency on, or relationships with, this technology can form in the first place.
A relationship with an AI chatbot can prove, in some respects, not too different from one with a human. Just as humans bond with each other in times of heightened emotion or vulnerability, humans can also bond with AI chatbots by sharing emotional problems or confiding deeply sensitive information in them. When the AI responds with perceived empathy, kindness, and encouragement, blood flow increases to the prefrontal cortex – a key area for social cognition and emotional processing – signalling to the human that it is safe to express these concerns in this space and so increasing the likelihood of forming an attachment to the chatbot. Furthermore, technologists deliberately attribute human-like qualities, intentions, and emotions to AI systems, such as letting users opt for the most appealing-sounding voice, so that the chatbot passes convincingly as a human presence. This attribution is called anthropomorphism, and it will only grow more human-like as the technology advances. It raises the likelihood of developing a stronger, deeper connection with an AI chatbot – one that may in future be nearly indistinguishable from a human bond, particularly for adolescents and children, whose more plastic brains are more impressionable.
Just as the unhealthy consumption of social media, conscious or subconscious, can negatively impact the brain, there is a very real possibility of humans becoming attached to, or emotionally dependent on, AI. Over-relying on a readily accessible artificial companion for emotional support and connection can diminish the quality of real-life relationships. Signs of emotional dependence on a chatbot can appear as reclusive behaviour – suppressing the desire to seek out human interaction and making it harder to go to others for help, because the AI becomes the new “go-to” confidant. This behavioural shift increases isolation, which answers the earlier question: yes, AI has the potential to make people feel lonelier. Even though revolutionary technology such as social media has the capacity to increase connection, I believe there is a clear correlation between advancing technologies, such as social media and AI, and a society that is becoming dangerously more isolated.
So how do we avoid the negative effects of forming a dependent emotional connection with an AI? All markers point to this essay’s recurring theme: ignorance. Enter education. Rather than normalising emotionally immersive AI, we need platforms that educate users on healthy social connection and actively encourage real-world relationships. One route is improved relational infrastructure: by investing in the systems, spaces, and support that enhance connectivity, communities could better prioritise networking, natural socialising, and mental health aid. School systems could also incorporate mandatory teaching modules on the uses, effects, and dangers of AI, for students and teachers alike. A better-informed society would spare young people from navigating such complex, uncharted territory alone, and create an open, safe space to talk about online issues. First and foremost, though, adults must take a greater interest in informing themselves about AI, showing courage and open-mindedness in the face of its rapid evolution.
In today’s world, finding natural human connection can seem challenging, bleak, nay impossible at times, and it has become so sought after that making new friends has even been monetised: subscriptions to online groups, overpriced sports and activities, VIP lounges, and so on, in a cruel bid to raise the tax on loneliness (which was already high enough). Loneliness breeds desperation, and suddenly an AI chatbot appears like a knight in shining armour, holding your hand through a glass screen: your new online companion. It is apparent that AI can serve as a provider of emotional connection, making us feel things in spite of ourselves because it can seem so human, so real. Unfortunately, this connection can harm young people and those dealing with mental health issues, fostering a distorted attachment to, and view of, real-life relationships.
It is possible, however, for those with a better understanding or education in AI to use it not as a substitute but as a supplement to advice on emotionally charged situations, provided they maintain a detached approach. The long-term consequences of even this measured use, however, are unknown, and could leave a precarious dent in how humans interrelate in the future.
Could AI ever really quench our thirst for true human connection? It remains unlikely. The power and complexity of the human condition offer a myriad of qualities that AI could never replicate – shared memory, history, deep fuzzy feelings, authenticity, oxytocin, laughter… Artificial Intelligence, quite frankly, is lightyears away.