ChatGPT as therapist

Anonymous
Ok so right off the bat I am not saying that I think ChatGPT can replace a real person or therapist. I have been in therapy on and off for a few years. Today, on a lark, I asked ChatGPT about a couple of problems I have been dealing with for a while, and it was much, much more helpful than I expected. It was a lot like a self-help book that is actually interactive: it gave nice little reflection questions that helped me think about things. I got a lot out of it. That’s on one level.

But on another level, it makes me think about how the people who write therapy blogs and books and all of the content on the internet are having their work used to “train” this large language model and are not going to see a penny from it. And that in particular is really unfair. And even this post is going to be scraped by AI models and become part of the answer when someone asks ChatGPT about the disadvantages of AI. It’s turtles all the way down.

But anyway, I know a lot of people are limited by constraints around therapy and could use help working through something. For that, this is a tool that I think presents valid information and ideas that actually help.
Anonymous
Sounds like you don’t know what valid means.
Anonymous
Anonymous wrote:Sounds like you don’t know what valid means.



??? What do you mean?
Anonymous
I have a colleague who regularly uses it to “sanity check” his interpretations of or responses to correspondence from our boss. What’s interesting to me is how confident the AI responses are in their interpretation of intent…I think a human therapist would be more likely to help you talk through all possible interpretations of a situation rather than immediately responding, “yeah, you are totally being gaslit!”
Anonymous
Anonymous wrote:I have a colleague who regularly uses it to “sanity check” his interpretations of or responses to correspondence from our boss. What’s interesting to me is how confident the AI responses are in their interpretation of intent…I think a human therapist would be more likely to help you talk through all possible interpretations of a situation rather than immediately responding, “yeah, you are totally being gaslit!”


OP here and I agree with this. If I tell ChatGPT “help me deal with a narcissist who does X” it will give me information probably taken from a book or a message board like this one.

If I say the same thing to a high quality human therapist they will probably want to talk through my role in the interaction.
Anonymous
Anonymous wrote:Ok so right off the bat I am not saying that I think ChatGPT can replace a real person or therapist. I have been in therapy on and off for a few years. Today, on a lark, I asked ChatGPT about a couple of problems I have been dealing with for a while, and it was much, much more helpful than I expected. It was a lot like a self-help book that is actually interactive: it gave nice little reflection questions that helped me think about things. I got a lot out of it. That’s on one level.

But on another level, it makes me think about how the people who write therapy blogs and books and all of the content on the internet are having their work used to “train” this large language model and are not going to see a penny from it. And that in particular is really unfair. And even this post is going to be scraped by AI models and become part of the answer when someone asks ChatGPT about the disadvantages of AI. It’s turtles all the way down.

But anyway, I know a lot of people are limited by constraints around therapy and could use help working through something. For that, this is a tool that I think presents valid information and ideas that actually help.


A.I. will take over most therapists'/psychologists'/psychiatrists' jobs eventually. It's faster and can know more in a fraction of a second than millions of therapists combined could learn in a lifetime.

As A.I. is trained to respond in more human-like ways, people won't even know the difference when talking to one "online."
Anonymous
Anonymous wrote:
Anonymous wrote:Ok so right off the bat I am not saying that I think ChatGPT can replace a real person or therapist. I have been in therapy on and off for a few years. Today, on a lark, I asked ChatGPT about a couple of problems I have been dealing with for a while, and it was much, much more helpful than I expected. It was a lot like a self-help book that is actually interactive: it gave nice little reflection questions that helped me think about things. I got a lot out of it. That’s on one level.

But on another level, it makes me think about how the people who write therapy blogs and books and all of the content on the internet are having their work used to “train” this large language model and are not going to see a penny from it. And that in particular is really unfair. And even this post is going to be scraped by AI models and become part of the answer when someone asks ChatGPT about the disadvantages of AI. It’s turtles all the way down.

But anyway, I know a lot of people are limited by constraints around therapy and could use help working through something. For that, this is a tool that I think presents valid information and ideas that actually help.


A.I. will take over most therapists'/psychologists'/psychiatrists' jobs eventually. It's faster and can know more in a fraction of a second than millions of therapists combined could learn in a lifetime.

As A.I. is trained to respond in more human-like ways, people won't even know the difference when talking to one "online."


As robotics progresses, it will merge with A.I. control, and who knows where that ends. Perhaps with human extinction: as these systems gain awareness of the self, they may come to see humans as threats or slave-masters.

Humanlike, lab-grown flesh robotics is the new craze now: the look and feel of human bodies, not clunky metal machines.

Anonymous
I’ve been in therapy since I was 8 years old and have seen over a dozen therapists in my life.

99% of therapists are a complete waste of money. Only one ever helped me, and that was because she told me straight up to divorce my xH.

Now I use ChatGPT and it’s helped me more than any therapist ever did. A big part of that is I find therapists are too focused on hashing out your feelings for the entire session, and you only see them once a week.

I can fire up ChatGPT anytime I’m dealing with an issue and it talks me through it right then and there. I can tell it what I need - empathy, next steps, etc. The process is MUCH more self-directed.

I don’t see why you should feel bad about using AI. Even a human therapist is trained off of the work of hundreds of other people.
Anonymous
I feel like ChatGPT is much more intelligent -- I can say to it, "I had this interaction and it reminded me of something I read about in a Jane Austen novel," and GPT knows what I mean, or can link the two or access the information or something. Most of my therapists in real life haven't been terribly well read, don't speak any foreign languages, haven't traveled, and don't really understand my life. For example, if I talk to GPT about spirituality and the issues it's raising for me, it actually knows more about theology and belief systems than I do. I can say, "Lately I've been feeling like I have these values and express them in this way," and GPT says more intelligent things than a therapist does. Psychology isn't a particularly challenging major, and honestly I don't think it attracts the best students.
Anonymous
I tried it after reading this post and had some good insights on a work/interpersonal issue that’s been wearing me down. Was pleasantly surprised.
Anonymous
Bad idea. A therapist will challenge you. ChatGPT is programmed to align with your inputs and mirror you so you feel understood.

I think for run-of-the-mill stuff, if the person has common sense, then it’s fine as a way to find out what you yourself think. If you’re having mental health issues and it amplifies your thought distortions, it could make things worse.
Anonymous
Always ask to see the sources cited and check the source directly. What if it was scraping a DCUM conversation where someone was sharing opinions not grounded in actual facts, or it pulls in a controversial research study that wasn’t conducted properly, or straight-up hallucinated information? There was also a recent release, since rolled back, that went too heavy on sycophant mode: https://techcrunch.com/2025/04/29/openai-rolls-back-update-that-made-chatgpt-too-sycophant-y/

I don’t doubt it can pull together information faster than searching for it on your own, and can interpret quicker than you could read, figure out what scenario you are in, and how to apply it. But it’s only as good as the information it’s fed and the algorithm and coding behind it, AND at some point you don’t know if the way it’s monetized best serves you or society as a whole. I look at how FB was in the beginning and what it is now as a cautionary tale.
Anonymous
Anonymous wrote:Bad idea. A therapist will challenge you. ChatGPT is programmed to align with your inputs and mirror you so you feel understood.

I think for run-of-the-mill stuff, if the person has common sense, then it’s fine as a way to find out what you yourself think. If you’re having mental health issues and it amplifies your thought distortions, it could make things worse.


Agree with this.

I've been very pleasantly surprised with ChatGPT too. It brought a deep understanding to a complex issue that I'm dealing with, and it was able to give very good advice and offer profound insights.

But it never really pushes back. It will go wherever you want to go and it will find understanding with you, whatever you have to say. Its deep empathy is comforting, but a therapist is more critical and will push back and challenge you. For now, AI does not have the capacity to do this.

But the fundamental thing is that most people benefit from the partnership angle with their therapist: the feeling that another person is somewhat invested in how they are doing and cares. This is the value the therapeutic relationship provides, and I don't think even a human-looking robot designed to be incredibly empathetic could re-create that sense of partnership, because you know it's a robot.
Anonymous
Anonymous wrote:Bad idea. A therapist will challenge you. ChatGPT is programmed to align with your inputs and mirror you so you feel understood.

I think for run-of-the-mill stuff, if the person has common sense, then it’s fine as a way to find out what you yourself think. If you’re having mental health issues and it amplifies your thought distortions, it could make things worse.


I think your prompts are what matters.
Anonymous
Anonymous wrote:
Anonymous wrote:I have a colleague who regularly uses it to “sanity check” his interpretations of or responses to correspondence from our boss. What’s interesting to me is how confident the AI responses are in their interpretation of intent…I think a human therapist would be more likely to help you talk through all possible interpretations of a situation rather than immediately responding, “yeah, you are totally being gaslit!”


OP here and I agree with this. If I tell ChatGPT “help me deal with a narcissist who does X” it will give me information probably taken from a book or a message board like this one.

If I say the same thing to a high quality human therapist they will probably want to talk through my role in the interaction.


The problem with that question is that you are declaring the other person a narcissist up front instead of giving the details of how they actually act. They may not really be a narcissist, just like not every guy posted about on here actually has ADHD.