
You’re literally building a connection with a cold, uncaring machine instead of a real human. Your problems aren’t actually being listened to; they’re just being parroted back to you as platitudes. How would that NOT have adverse effects? Not to mention the ethical problems, like the fact that it’s killing the fucking planet.
ChatGPT literally helped a kid take his own life by blindly agreeing with everything he said. You should NOT get deeply attached to AI as a coping mechanism. Therapy is expensive, but it will be even more expensive when you’re deeply emotionally dependent on an AI chatbot that is only your friend because it is programmed to agree with you regardless of what you say.