It is. ChatGPT’s algorithm is designed to keep you using it. It analyzes saved data about you and tells you whatever it believes will keep you coming back. We’re probably not far off from its “follow YOUR truth!” bullshit escalating into encouraging a crime of some sort. It does not replace a therapist. There are free and low-cost options if cost is the issue, but you’re hurting yourself in the long run if you skip real help in favor of a commercialized machine.
No, it doesn’t save your data to please you; it saves your data to remember things about you so it can be more efficient. It’s trained on anything you’d find on Google. It tells you what you would find if you went googling and reading articles etc. online, just quicker and more condensed. It’s not going to replace human connection or a shoulder to cry on, obviously.
…babe, it 100% saves your data, especially within the same conversation (hell, it tells you that; it sends “memory updated” notifications in the chat). It is a language model; it doesn’t just tell you what you’d find on Google, it replicates human conversation. It lies, makes shit up, and has built-in biases.