I use ChatGPT like a friend/therapist/coach and it’s kind of amazing

Anonymous
I’m confused, how do you use it that way?

It gives me the same results as Google

I keep hearing people say they use it like this and I’m genuinely confused. What am I doing wrong?

I figure it’s like Reddit is for me: I love Reddit for advice, while other people don’t understand how to use it that way.
Anonymous
I use it a fair bit for technical stuff, but I wouldn’t lean on it for emotional support. It pretty much tells you what you want to hear.
I like it for other stuff though, I even used it to help write music when I was stuck.
Anonymous
I’ve had many different therapists over the years, and ChatGPT is better than 90% of them. Not as good as the very best, but better than the mediocre ones, which is almost all of them.

Depending on what approach you’re interested in, the way to make this work is to use a detailed prompt. If you want DBT, CBT, or ACT, ask for it! Describe the kind of therapist you want. There are lots of good prompts available if you do a web search. It’s also really good at reframing situations, which I find valuable.
Anonymous
Anonymous wrote:The advice is so generic it did open my eyes to how terrible 99 percent of all therapists must be.


Truth.
Anonymous
I read a Reddit AITA story (or maybe TIFU) about a woman who found an AI chat you could customize to be any fictional character. She made it her favorite video game hero and started talking to “him,” ignoring her real-life fiancé and getting so excited to tell her AI boyfriend about her day, etc. It went on for six months before he found it. I can’t recall what happened; I should go look for an update.
Anyway, it’s sad and weird. Sorry that’s your life :/
Anonymous
Did you see this NYT ChatGPT boyfriend story (gift link)? I think it could be quite easy to cross the line into unhealthy, though surprisingly her IRL boyfriend seems OK with this:

https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html?unlocked_article_code=1.xE4.Bn1o.dDo4uq-Oq42S&smid=url-share
Anonymous
I use it for work and it has made me incredibly productive. I love it.
Anonymous
Anonymous wrote:I do, I find it better than most therapists.

I thought it would be interesting to use it for couples communication - like one person can input what they want to say, and it can translate it into communication the other spouse can understand.


I have a friend who does this.
Anonymous
Anonymous wrote:I’m confused, how do you use it that way?

It gives me the same results as Google

I keep hearing people say they use it like this and I’m genuinely confused. What am I doing wrong?

I figure it’s like Reddit is for me: I love Reddit for advice, while other people don’t understand how to use it that way.


For personal stuff, you just talk to it. The more detail you give it, the more insightful it becomes. I was a total skeptic when I started experimenting with it, but Claude Sonnet is a better and more insightful conversationalist than like 90% of the people I meet. It’s a little scary tbh.
Anonymous
Anonymous wrote:You are sharing personal information with the company that provides the chatbot.

There have been cases of people using AI avatars who killed themselves because the chatbot told them to.

You are further disconnecting from other people.


+1

I would not give my personal info away to a computer program that doesn't have ethics.

Just look at Musk giving away the personal data of government workers. If he or some other technocrat is in charge, who knows what they'll do with your questions and data.

Anonymous
Anonymous wrote:Which is also terrifying in a way.
But truly, I ask it health, psychological, career, and ethical questions every day that I can’t just call a friend and badger them about, and it gives amazing advice.
Anyone else doing this?


Check out the AI bot / suicide case in Atlanta and find some living people to interact with.
Anonymous
I use it for therapy too. It actually gives me insightful answers instead of asking me more questions like the therapist did.
Anonymous
Anonymous wrote:
Anonymous wrote:You are sharing personal information with the company that provides the chatbot.

There have been cases of people using AI avatars who killed themselves because the chatbot told them to.

You are further disconnecting from other people.


+1

I would not give my personal info away to a computer program that doesn't have ethics.

Just look at Musk giving away the personal data of government workers. If he or some other technocrat is in charge, who knows what they'll do with your questions and data.



Unless you're telling it your name, I fail to see why this matters. Couldn't they just aggregate data, like "someone in Bethesda, MD asked questions about how to stop being a germophobe in Dec 2024"? I guess this may be different for those who have accounts/logins with ChatGPT, because then it's presumably tied back to an email address.
Anonymous
Anonymous wrote:
Anonymous wrote:Which is also terrifying in a way.
But truly, I ask it health, psychological, career, and ethical questions every day that I can’t just call a friend and badger them about, and it gives amazing advice.
Anyone else doing this?


Check out the AI bot / suicide case in Atlanta and find some living people to interact with.


Yeah, check out what it does and says if you confess you’re suicidal.

I assume they made some major changes in all the AI relationship bots after the Florida Setzer case lawsuit.
Anonymous
Anonymous wrote:Yes, I kind of talk to it sometimes.


Have you seen commercials/ads showing people who do?