I train AI on the side and emotional support has been a big focus recently. I can’t say much because of my NDA but AI is really growing past basic data and probability. Experts doing human reinforcement training are a big part of that. I haven’t been able to contribute like that to psychology focused projects but I do with my field and it’s SO cool to see it impacting the end user!!
no, i’m telling you as someone with a computer engineering degree who’s in grad school for a second one: its data munging processes don’t weight sources the way a person with critical thinking would, so a technically correct or clinically valid source carries the same individual weight as the million and a half people on social media who have no idea what they’re talking about, and by sheer volume the experts get outweighed.
Definitely be careful with it because its primary goal is to please the user, so you’ll get wrapped up in confirmation bias if you’re not careful. Prompt it to play devil’s advocate occasionally and of course think critically about your conversations with it before acting on them. And never feed it PII even if you have the “use for training” setting turned off.
Do you also say that to people teaching their grandparents how to use a smartphone? I’m teaching someone to use a technology. It’s not a medical technology. It’s very much in my expertise. If they want to use it for medical stuff, that’s on them. It’s a tool like any other. People are gonna use it how they want to. I’ll provide safety tips, thanks.