OP here: I’ve gotten lucky w my therapist. It was sometimes so hard to help my old friends advocate for themselves in therapy. I’d be like “please switch therapists” etc. Sadly a lot of ppl use these things for either validation or ignorance (in the lighthearted sense). And some ppl just don’t actually put in the effort.
i honestly think ppl are being way too impatient with ai. like imo ChatGPT isn't even capable of giving cohesive answers yet. if u ask it something it'll just word vomit whatever info it has on one keyword from ur question, essentially the same shit Google does. cause let's be so fr, ChatGPT is heavily restricted in what it can and can't say to users, and if the "ai" were really at a standard where it could give u real answers, the developers wouldn't be using user data so heavily to train it.
Well, talking to u is like talking to a wall. If you’re frequently under distress you need to seek help from a professional. Even if not, you may still want to talk to someone about these things to make a crisis plan. During the school year most colleges offer free counseling services and could help u get an actual therapist.
I’m saying you need to take action yourself. YOU clearly rely on ChatGPT too often. And providing advice is what y’all go to ChatGPT for all the time. If you’ve tried something, fine. But I love how u haven’t acknowledged any real advice we’ve given you, and u just keep defending ur usage of GPT. That’s why we keep coming back at u. Bc ur not actually advocating for urself.
i think it’s hard for u to give advice in my place bc you don’t know much about my situation, and none of what anyone has said applies to me. like all the suggestions ppl are giving me i have already tried or do anyways. i really don’t use it that often lmao. it’s not my first go-to when i’m having issues, it’s just occasional, so i truly don’t see how it’s that deep.
I def think that he was in a bad mental place to be using that app, and it was really sad when I found out about it. My problem is that ppl were blaming the platform, but when u look into the app it already said “everything said in these chats are fictional” and it advertises itself as a role-playing app. Obviously apps like these aren’t good for ppl in bad mental places. And parents need to start paying closer attention to kids’ phone activity and mental health.
i agree, valid points. i just can’t understand how the developers didn’t think abt something like that happening when people started using ChatGPT as a therapist. i understand you cannot predict something like that, but i’m sure they could’ve told it not to advise suicide or self-harm no matter how fictional.