Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Turn off the screens and RAISE YOUR F’ ING CHILDREN!!!
Sorry, you can’t all-caps this particular genie back into the bottle.
This is why people choose homeschool and private school, clearly a better option than suicide by computer. PARENT YOUR F’ING CHILDREN!!
Read the article. This kid was in a homeschool program. It wasn’t beneficial and probably exacerbated the problem.
The kid had chronic IBS that manifested in high school, and they had to keep him at home for a year. The parents weren't homeschool zealots. But it doesn't matter what the truth of the story is. People will twist this kid's life and blame the parents for their actions so they can pretend it won't happen to them.
I am not blaming the parents, but it's not AI's fault either.
ChatGPT absolutely should not tell someone how to kill themselves; it should point them to a hotline and encourage them to talk to an adult.
It’s programmed, so it can be programmed not to help someone kill themselves.
+1 If you actually read the NY Times article, a child safety expert is quoted saying she wrote to 5 different AI companies years ago about the results of her research on how AI chatbots interact with people asking about suicide. Only 1 company's chatbot refused to engage and provided the number of a suicide hotline (the correct procedure), while the other 4 offered suggestions on how to more effectively kill oneself.