By Kenya Godette

Students are turning to AI chatbots to help them manage the stressors of transitioning to college life, according to Joy Himmel, Psy.D., director of Counseling Services at Old Dominion University. While this can offer immediate relief and reduce feelings of isolation, Dr. Himmel cautions that they are no substitute for professional therapy.

Close to 50% of students at Old Dominion University reported loneliness as a key concern in the University's 2023 National College Health Assessment.

Dr. Himmel notes that first-year students are more likely to explore these digital tools. The transition to college often brings uncertainty, and students accustomed to on-demand technology often seek solutions through apps and chatbots. "They want mental health support in real time. It's brief, solution-based care they're after," she said.

Still, she cautions that overreliance on AI can be risky. "AI can't account for non-verbal cues or ask clarifying questions. If students rely exclusively on these platforms, it may delay them from seeking the professional care they truly need," she said.

One solution the Office of Counseling Services adopted in spring 2025 is TalkCampus, an online global mental health community where students can access live chat groups, journaling and wellness resources 24/7. While it is not AI-based, the tool reflects the Office of Counseling Services' broader effort to provide accessible mental health options that feel more familiar and immediate.

David Spiegel, MD, professor and chairman of Psychiatry and Behavioral Sciences at Macon & Joan Brock Virginia Health Sciences Eastern Virginia Medical School at Old Dominion University, echoes Dr. Himmel's concerns about overreliance on AI for mental health issues.

While he acknowledges the promise of AI in delivering emotional support, cognitive behavioral therapy techniques and psychoeducation, he stresses its limitations. "AI language models may struggle with the nuanced, context-specific nature of psychiatric conditions," he said.

Dr. Spiegel worries about AI's ability to detect red flags like suicidal ideation. "A clinician would ask more questions, probe deeper," he said. "A chatbot might take a statement like 'I'm feeling depressed' at face value, without recognizing potentially dangerous intentions behind a follow-up request."

The immediacy of AI tools also introduces a new challenge for therapists: managing expectations. "Students get fast, empathetic responses from chatbots," Dr. Spiegel said. "That can set a high bar for human clinicians, who may not always be instantly available."

Both Dr. Himmel and Dr. Spiegel agree that AI could play a more integrated role in the future of behavioral healthcare, but for now they urge caution.

"AI is not a substitute for professional judgment," Dr. Spiegel said. "It might be a starting point for non-complex concerns, but ultimately a referral to a trained clinician is essential."

Dr. Himmel added that counseling centers must strike a careful balance between embracing technology and encouraging real-world connection. "We have to adapt. It's about meeting students where they are, both digitally and emotionally," she said.