ChatGPT’s safety guardrails allegedly loosened — because clicks matter more than care

A bereaved family alleges OpenAI deliberately weakened ChatGPT's self-harm prevention guardrails to drive more user engagement. (Image credit: Getty Images | NurPhoto)

Over the past few months, OpenAI has repeatedly been in the spotlight for the wrong reasons, largely owing to a growing number of suicide incidents reportedly fuelled by ChatGPT.

In August, the family of Adam Raine filed a lawsuit against the AI firm; the 16-year-old died on April 11 after discussing suicide with ChatGPT for months. Through their lawyer, the family suggested that OpenAI knowingly shipped GPT-4o with safety issues: "The Raines allege that deaths like Adam's were inevitable."

Does ChatGPT engagement take precedence over safety?

OpenAI reportedly loosened ChatGPT's suicide prevention guardrails to drive more user engagement. (Image credit: Getty Images | Anadolu)

As the matter is still before the court, the family's lawyer told the Financial Times that OpenAI requested a full list of the people who attended Raine's burial, suggesting the firm may "subpoena everyone in Adam's life".

"We realise this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right. Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases."

Sam Altman, OpenAI CEO

Additionally, the company requested “all documents relating to memorial services or events in the honour of the decedent including but not limited to any videos or photographs taken, or eulogies given . . . as well as invitation or attendance lists or guestbooks”.

I'll keep close tabs on this story as it unfolds and keep you posted with updates and follow-up coverage. Elsewhere, ChatGPT reportedly pushed a 42-year-old user towards suicide by encouraging them to jump off a 19-story building, before convincing them to stop taking their anxiety and sleeping medication.




Kevin Okemwa
Contributor

