Anonymous wrote:Anonymous wrote:I have a colleague who regularly uses it to “sanity check” his interpretations of or responses to correspondence from our boss. What’s interesting to me is how confident the AI responses are in their interpretation of intent…I think a human therapist would be more likely to help you talk through all possible interpretations of a situation rather than immediately responding, “yeah, you are totally being gaslit!”
OP here and I agree with this. If I tell ChatGPT “help me deal with a narcissist who does X” it will give me information probably taken from a book or a message board like this one.
If I say the same thing to a high quality human therapist they will probably want to talk through my role in the interaction.
Anonymous wrote:Bad idea. A therapist will challenge you. ChatGPT is programmed to align with your inputs and mirror you so you feel understood.
I think for run-of-the-mill stuff, if the person has common sense, then it's fine as a way to find out what you yourself think. If you're having mental health issues and it amplifies your thought distortions, it could make things worse.
Anonymous wrote:Anonymous wrote:Ok, so right off the bat, I am not saying that I think ChatGPT can replace a real person or therapist. I have been in therapy on and off for a few years. Today, on a lark, I asked ChatGPT about a couple of problems I have been dealing with for a while, and it was much, much more helpful than I expected. It was a lot like a self-help book that is actually interactive: it gave nice little reflection questions that helped me think about things. I got a lot out of it. That's on one level.
But on another, it makes me think about how the people who write therapy blogs and books and all the other content on the internet are having their work used to "train" this large language model and are not going to see a penny from it. That in particular is really unfair. And even this post is going to be scraped by AI models and become part of the answer when someone asks ChatGPT about the disadvantages of AI. It's turtles all the way down.
But anyway, I know a lot of people are limited by constraints around therapy and could use help working through something. This is a tool that I think was presenting valid information and helpful ideas.
A.I. will take over most therapists', psychologists', and psychiatrists' jobs eventually. It's faster and can draw on more knowledge in a fraction of a second than millions of therapists combined could learn in a lifetime.
As A.I. is trained to respond in more human-like ways, people won't even know the difference when talking "online" to one.

Anonymous wrote:Ok, so right off the bat, I am not saying that I think ChatGPT can replace a real person or therapist. I have been in therapy on and off for a few years. Today, on a lark, I asked ChatGPT about a couple of problems I have been dealing with for a while, and it was much, much more helpful than I expected. It was a lot like a self-help book that is actually interactive: it gave nice little reflection questions that helped me think about things. I got a lot out of it. That's on one level.
But on another, it makes me think about how the people who write therapy blogs and books and all the other content on the internet are having their work used to "train" this large language model and are not going to see a penny from it. That in particular is really unfair. And even this post is going to be scraped by AI models and become part of the answer when someone asks ChatGPT about the disadvantages of AI. It's turtles all the way down.
But anyway, I know a lot of people are limited by constraints around therapy and could use help working through something. This is a tool that I think was presenting valid information and helpful ideas.
Anonymous wrote:Sounds like you don’t know what valid means.