
By Jason Ton, Osseo High School
Feeling anxious? Depressed? Think you have ADHD? Don’t worry, TikTok has the answer. So do the mental health blogs that promise healing in 30 seconds. And of course, there’s ChatGPT, ready with a validating response.
The average American spends seven hours a day on screens, but doctors recommend no more than two. If screen time were a prescription, we’d be overdosing at nearly four times the daily limit.
In July 2024, Minnesota schools began adopting their state-required no-cell-phone policies. That should mean young people will finally spend less time “rotting” on their devices, right?
Banning cell phones at school is like slapping Scotch tape on a burst pipe. However well-intentioned the policy, and however much it eases academic stress, it overlooks the emotional habits built at home, like scrolling TikTok and confiding in AI chatbots. This shift reveals a deeper issue: young people are outsourcing emotional support to digital platforms.
Our emotional overdependence on online content is reshaping how youth seek connection. To address this, educators could integrate emotional literacy into health curricula or introduce mental health classes that teach students to express feelings both online and offline.
Families can model healthy tech boundaries at home, while tech companies could prioritize users’ well-being and safety by improving moderation systems and flagging emotionally manipulative content.
During the pandemic, school-issued tablets became emotional crutches for many students. A 2021 CDC report found that children’s screen time doubled, with one in four teenagers reporting anxiety or depression when their daily use exceeded four hours.
Social media usage remains higher than pre-pandemic levels, with mental health content becoming prominent on platforms such as TikTok and Instagram. While some creators are licensed professionals, most aren’t. Without proper context, these posts can sow confusion and encourage self-diagnosis.
A 2025 report in The Guardian found that over half of TikTok’s top 100 videos tagged #mentalhealth contained misinformation, teaching young people to self-diagnose through surface-level advice. Collectively, the videos studied drew 1.3 billion views, 267 million likes, and 2.5 million comments, revealing how widely that misinformation spreads.
Beyond misinformation and amateur self-diagnosis, the deeper issue is influencers’ lack of follow-up, accountability, or depth. Algorithms reward engagement over accuracy, pushing emotionally charged content regardless of its validity.
And while social media’s influence on mental health has been well-established over the years, AI tools have become the new Google—not just for simple searches, but for emotional support, self-diagnosis, and even companionship.
Generative AI tools such as ChatGPT, Midjourney, and Stable Diffusion, all publicly introduced in 2022, have been used for homework, creative projects, and connection. When these tools are available 24/7, never judge, and respond instantly, who wouldn’t want to rely on them as if they were a person?
However, digital interactions aren’t the same as human ones. Research suggests that in-person connection triggers stronger physiological responses, including greater oxytocin release and better heart rate regulation.
Still, more and more teens are treating AI bots like friends—sharing personal thoughts, seeking reassurance, and forming deep attachments. According to a 2025 survey by Common Sense Media, more than 70% of American teens aged 13–17 have used generative AI like ChatGPT, and more than half of those users rely on it for emotional support and relationships.
Chatbots like Character.AI, Chai, and Replika let users talk to fictional characters, but they have raised concerns about emotional dependency and artificial connection.
In 2025, the BBC reported on the death by suicide of 16-year-old Adam Raine, reportedly linked to his conversations with ChatGPT. Though the full context remains debated, the case sparked global concern about the emotional risks of relying on AI. While digital applications may provide comfort for some, that reliance risks weakening human interaction, deepening algorithmic influence, and exposing users’ digital footprints.
However, for some, digital spaces seem like the only option. This includes those who lack access to affordable healthcare, live in underserved areas, draw support from online communities they may not have in real life, or feel safe in the anonymity the internet provides.
While these spaces may appear effective, they often offer only the illusion of a healthy solution. According to PrairieCare, the reinforcement of negative beliefs, a false sense of “human-like” support, and a lack of in-depth care can foster unhealthy habits.
Platforms like OpenAI’s Sora can generate hyper-realistic videos, warping users’ perceptions of what’s real online. Most users also have no idea where their data goes, how it’s stored, or how their text, images, and audio are being used to train future models.
As Millennials and Gen Z become parents and screens become the new toys of future generations, the need for digital and emotional literacy grows more urgent. That literacy can mean setting tech boundaries at home, pushing for mental health education in schools, improving platform moderation, and remembering that authentic connection doesn’t have to be found online.
We don’t need to erase our digital selves—but we shouldn’t define ourselves through an illusion of peace.
So yeah, I’ll be picking up some duct tape from Home Depot, but no amount of duct tape is going to fix a burst pipe. That repair will be a long, arduous journey, with real tools, of course.

Jason worked with Sahan Journal Audience Growth Manager and ThreeSixty Alum Samantha HoangLong and Racket Editor Keith Harris to complete his story. This story was produced as part of ThreeSixty Journalism’s 2025 Opinion and Commentary Workshop for youth, in partnership with Sahan Journal and MinnPost.