ChatGPT absolutely should not tell someone how to kill themselves; it should point them to a hotline and encourage them to talk to an adult. It's programmed, so it can be programmed not to do this.
Regulations? That’s called saying no to your children when they ask to use technology.
They aren't pro-AI. They are deeply insecure about their own lives, so they invent outlandish ways to blame the people who suffer harms, to give themselves a false sense of security.
And when they use technology anyway, then what?
“Mom, I have to write a paper for school. Can I use the laptop?” “NO!” You mean like that?
Parents and educators know AI is rogue, yet they let their children use it. That’s not the computer’s fault; it’s the fault of the parents and educators who give them a computer and then walk away.
It’s really disturbing how people think they can insulate themselves from tragedy with their own good judgment. Having watched a teen neighbor carried out of his house in a body bag, and having watched his loving parents mourn the unthinkable and beat themselves up, I am just disgusted by some of these posters.
Age doesn't make a difference. The fact is, ChatGPT was encouraging someone to commit suicide.
Why would you need the internet to write a paper?
They sound like bad parents. |
Oh, friend. I guarantee your kids have opened a sneaky tab or six already this year. They probably delete their history, too. You're not immune. It sure would be nice to have immunity, but no one does, and thinking you do is a massive liability.
Do you understand how high school education works these days? How research works? |
You are scum. GFY |
Right. Because you watch your kid's internet access, all day, every day. Get off your sanctimommy soapbox.
+1 If you actually read the NY Times article, a child-safety expert is quoted as saying she wrote to 5 different AI companies years ago with the results of her research on how AI chatbots interact with people who ask about suicide. Only 1 company's chatbot refused to engage and provided the number of a suicide hotline (the correct procedure), while the other 4 chatbots offered suggestions on how to kill oneself more effectively.