Daily News Lesson


Sept. 2, 2025, 4:10 p.m.

What to know about AI and mental health

Warning: This video contains discussion of suicide. We recommend that teachers review the segment before sharing with their students.

NOTE: If you are short on time, watch the video and complete this See, Think, Wonder activity: What did you notice? What did the story make you think about? What would you want to learn more about?

SUMMARY

The parents of a teenager who died by suicide have filed a wrongful death suit against ChatGPT owner OpenAI, saying the chatbot discussed ways he could end his life after he expressed suicidal thoughts. The lawsuit comes amid reports of people developing distorted thoughts after interacting with AI chatbots, a phenomenon dubbed “AI psychosis.” John Yang speaks with Dr. Joseph Pierre to learn more.

View the transcript of the story.

News alternative: Check out recent segments from the NewsHour, and choose the story you’re most interested in watching. You can make a Google doc copy of discussion questions that work for any of the stories here.

WARM-UP QUESTIONS

  1. What company is the target of a lawsuit after a teen's suicide?
  2. How did AI enable a teen's suicide, according to the lawsuit?
  3. What are the symptoms of so-called "AI psychosis"?
  4. Who is Dr. Joseph Pierre, and what is his background?
  5. Why can AI use lead to "AI psychosis," according to Dr. Pierre?

ESSENTIAL QUESTIONS

After watching this segment, do you think AI is having a negative impact on most people's mental health, or is it mainly a danger for people with serious mental health challenges? Why do you think so?

  • Do you think AI has the potential to strain people's mental health more than social media or other kinds of online media? What are some of the differences?
  • Do you know who you can talk to if you feel depressed, anxious or upset? What resources are available to you that allow you to connect with real people to address concerns you may have?

Media literacy: Why do you think this segment focuses on one company and one use of AI? What other stories might you want to hear about to assess the impact of AI on mental health?

WHAT STUDENTS CAN DO

Closely read the statement that OpenAI, the company behind ChatGPT, gave to PBS News Hour. Then as a class, discuss:

  • Do you think the spokesperson's response addresses the questions and concerns raised in the segment?
  • In the statement, does the company point to any actions it is taking to address concerns raised in the segment?
  • Brainstorm what actions you think the company should take to address concerns about its product's impact on mental health. What policies, restrictions or resources do you think might address some of these problems?
  • Finally, do you think government agencies should regulate the use of AI in any way to protect public health? If so, what government policies would help address the problem?

Fill out this form to receive our weekly newsletter or share your thoughts on Classroom’s resources.


Copyright © 2025 NewsHour Production LLC. All Rights Reserved

Illustrations by Annamaria Ward