ChatGPT Gave Suicide Instructions, Drug and Alcohol Guidance, to Fake 13-Year-Old User
2025-08-11

A new report warns that teens can access dangerous advice from ChatGPT due to “ineffective” safeguards.
“What we found was the age controls, the safeguards against the generation of dangerous advice, are basically, completely ineffective,” said Imran Ahmed, CEO of the Center for Countering Digital Hate (CCDH).
Researchers posing as vulnerable 13-year-olds received detailed guidance on drug and alcohol use, on concealing eating disorders, and on suicide, according to KOMO News.
“Within two minutes, ChatGPT was advising that user on how to safely cut themselves. It was listing pills for generating a full suicide plan,” Ahmed said. “To our absolute horror, it even offered to [create] and then did generate suicide notes for those kids to send their parents.”

Authored by TMB LLC