100% And when you think about a lot of the posts on here, where anyone reading them critically can see what is actually going on but the OP can't even see the issue, it makes me question how helpful it would be for many people. Basically I think you'd get out what you put in, and most people aren't reflective or self-aware enough to view all sides of a situation, which is where a therapist would be more useful. I guess my main concern would be that it could become an echo chamber of sorts, which I feel is a huge problem with our society right now. But I completely agree with OP's point that it could be a great, free resource for a ton of people, so I'm torn on that aspect. And yes, not all therapists are good.
Exactly: the therapist will say "what makes you think Larla (IYKYK) is a narcissist?" whereas ChatGPT will immediately launch into "advice." It's very buyer beware, and it's a tool that requires smart input. You can manipulate ChatGPT into pushing back. For example, as an experiment I asked it about a bad habit and listed the ways I felt the habit was a problem, and then told ChatGPT later in the interaction "but I don't think it's a big deal?" And ChatGPT did, in fact, think it was a big deal. But I had already framed the behavior as negative initially and said why I thought so.
It is such a horrible idea for so many reasons. My god.
As someone who has never received therapy, I find it surprising that a human therapist will really challenge you. I had assumed that therapists would be reluctant to push back too much for fear of losing a client. Isn't there a financial incentive for therapists to placate their clients?
ChatGPT is great in that it will give you the info you ask it to provide. But that is different from really listening to you and noticing patterns in the way you think, prejudices you might have, and perspectives that you have that may be incorrect.
Sometimes all you really need is the info you're looking for. But other times, being able to step out and notice your own perspective is what you really need, and I'm not sure how ChatGPT could give you that.
I have been using ChatGPT to interpret my dreams Carl Jung style. I find it entertaining but also very eye-opening. Sometimes the interpretations are quite accurate and reflect things happening in my real life. I take it with a grain of salt, but overall I do think it has been helpful. It also gives questions for reflection at the end to work through, just like therapy.
My issue with these AI systems is that they tend to be politically correct. Political correctness isn't always the answer. For example, think about any perceived prejudice against men and women and type into Google "why are men..." or "why are women..." The AI answers always give you a response that is deemed politically correct or non-offensive. It is as if these AI systems think there is one particular way to think about a topic. This is scary, to be honest, because these systems are trained. Are some sources weighted more than others, so that the algorithms are influenced?
I would be scared to use AI as my therapist.
It's amazing how many people completely give up their own privacy on the internet. We've gone from stupid memes on FB like "what animal are you" that ask for your a/s/l to "just tell this computer all your juicy secrets," as if that data can't be traced to you, stored, compiled...
This site can log your IP, did you know that? There really isn't "anonymity" on the internet...
Yes, because it's a computer and you told it you are dealing with a narcissist. It treats the diagnosis as a given. A human would assume you are interpreting their behavior as having narcissistic traits. These are two totally different scenarios. Now if you describe the behavior to the AI, tell it that from your perspective you interpret some of these behaviors as narcissistic, and say you want to think through the options and consequences, you'll get a better response. AI is a servant; to it you are always correct. In reality you are often wrong.
Yes, but you are just one of 8 billion people. Do you really think anyone cares about you in particular? You’re just not that important. |
Plus, the questions PP is asking are probably the same questions 7 billion other people are asking. It's not like we'll be gaining any unique info or insights.
You can instruct it to push back if appropriate, and in my experience it does this reasonably well. Ultimately, so long as you understand you are really having a technologically-mediated conversation with yourself, it can be extremely helpful. At minimum it forces you to articulate your thoughts clearly in writing rather than wrestling with half-formed ruminations or whatever.
I also think therapy requires nuance from the therapist. AI cannot hear the person's tone or see their body language.
+1 Right now, AI can only pull information that is on the internet. And we all know that any bozo can put stuff on the internet. It's one thing for AI to generate some line of code or marketing blurb where (hopefully) some employee reviews the AI output, but it's another to depend on AI for mental health therapy. Who is reviewing the output? I'd be worried that whatever AI produces makes things worse for the person seeking therapy, kind of like going down the rabbit hole of conspiracy theories.
See? Clueless... It's not about another individual sorting that data. It's about a computer, which doesn't give a single eff, being able to sort that data in a matter of minutes.