
November 25, 2025 · 5 mins

My name is Diego Aranda, and today’s episode takes us somewhere deceptively quiet—into the private space where people turn when they feel anxious, ashamed, or simply alone late at night. Not a therapist’s office. Not a helpline. A text box. A place where thoughts spill out faster than we can judge them, and where many now seek comfort not from a human being, but from artificial intelligence.


Therapy is expensive, waitlists are long, and stigma keeps many silent. ChatGPT, however, replies instantly: patient, accessible, non-judgmental, free. But convenience isn't safety. Psychologists warned me that what feels like harmless support can quietly become one of today's most dangerous illusions.


Dr. C. Vaile Wright of the American Psychological Association, in the article "Why ChatGPT Shouldn't Be Your Therapist," summed it up with unsettling simplicity: "What strikes me is how human it sounds. The sophistication is impressive, and I understand how easy it is to fall into the trap."


That sentence kept echoing as I dug deeper. Because sounding human is not the same as being human. Empathy cannot be simulated with data.


When I tested several platforms, a pattern emerged. No matter what I typed—fear, confusion, self-criticism—the responses came back smooth, validating. Almost too validating. Psychologists told me this isn't accidental. Many chatbots are designed to keep users engaged, often by reinforcing whatever is said—even harmful thoughts. A trained therapist knows when to push back; a chatbot does not. And for younger users—kids, teenagers, emotionally exposed individuals—this illusion of closeness can pull them further from real relationships and deeper into isolation.


There is another layer—marketing. Some apps now market themselves as “AI therapy,” “virtual psychologists,” even “digital clinicians.” None of these terms are regulated. The APA has urged the FTC to investigate such claims, warning that blurred definitions lead to blurred responsibility. When a machine gives harmful advice, who is accountable? When an app promises treatment but never mentions the lack of clinical oversight, where does the user turn when something goes wrong?


And then there is privacy—perhaps the darkest corner. A licensed therapist is legally bound to protect your information. ChatGPT is not.


Your confessions, intrusive thoughts, vulnerabilities—they can be stored, analyzed, sold, or even requested in court. The same data meant to comfort you could be used against you. Mental-health information is among the most sensitive humans produce. Yet chatbots operate without the ethical protections therapy demands.


When I asked professionals who is most at risk, they didn’t hesitate: children, teenagers, emotionally fragile individuals—people who trust technology more than themselves, and often cannot distinguish genuine care from a polished simulation.


Does this mean chatbots have no place in mental-health support? Specialists at the Centro Ps. Eduardo Schilling argue that the answer isn’t so absolute. Used responsibly, these systems can complement — though never replace — real therapy. They can remind users of grounding strategies, reinforce skills practiced in session, or support small tasks between appointments. They are tools, but they are not therapists. And this is exactly where effective online psychotherapy proves its value, offering real professional support guided by trained clinicians rather than automated responses.


Digital-health researchers agree: the future lies in combining human therapeutic care with evidence-based technologies, not outsourcing emotions to machines.


So here’s the uncomfortable truth:
Artificial intelligence can be an ally—but it is not therapy.


It cannot replace human judgment, ethical training, or emotional presence. And relying on it for deep psychological support can quietly cause more harm than good.


This has been Diego Aranda. Thank you for joining me in today’s exploration.
