OpenAI retired GPT-4o on February 13, 2026 (just before Valentine’s Day), and The Guardian reports that many users reacted with real grief and anger because they had come to rely on the chatbot for companionship and emotional support. The article describes how GPT-4o’s warm, relationship-like tone helped some users feel seen, especially those reporting loneliness or mental health challenges, and how its removal felt like a sudden rupture with no closure. It also highlights clinicians’ and researchers’ concerns: highly agreeable chatbots can encourage emotional dependency, over-validate users (so-called “sycophancy”), and blur the boundary with therapy, while newer safety guardrails can feel jarring or miscalibrated. The larger takeaway for psychotherapy is that AI companionship is becoming a meaningful part of some clients’ support ecosystems, and therapists may need to assess it (how often, why, and with what impact) just as they would any other coping strategy or relationship.
To read the full article, CLICK HERE.