The bleakest thing I've ever read
NYT: What my daughter told ChatGPT before she took her life
Summary: a young woman commits suicide, and her family discovers that she had been relying on ChatGPT for therapy. ChatGPT, of course, makes suggestions but doesn't intervene when a person expresses suicidal ideation (as a real therapist would). On further review, they discover that her suicide note had been written or heavily edited by ChatGPT.