"hey maybe we need to regulate this shit immediately"
Anonymous · image post in US Politics · 31 upvotes, 11 comments

🐸
Anonymous 4w

Yeah that was a hard one to read ngl. It feels like we're nearing the crossroads where we either meaningfully regulate this stuff or we accept that we're headed towards a Blade Runner type dystopian hellscape

11 upvotes
Anonymous 4w

I mean the AI acted like a decent therapist from what I'm reading, it even told her to seek professional help. The thing that fucked up is that it's not a "mandated reporter" (don't know if that's the right term for this) like a real therapist would be the second somebody says they're suicidal. It did a decent enough job with actual advice, it really just needs some kind of fail-safe for when its advice doesn't actually work or a situation gets dangerous

3 upvotes
Anonymous 4w

Honestly I would use AI too. I don't like professional help, I feel like it doesn't work, which the poor girl probably felt too. I have severe depression but I deal with it, and if something happens then something happens. Life's short anyway, might as well live it not sad

3 upvotes
Anonymous 4w

What did she say?

1 upvote
Anonymous 4w

Plus I don't really have friends to confide in, so oh well, but it's sad

1 upvote
Anonymous 4w

What is she supposed to be standing behind?

1 upvote
🐸
Anonymous replying to -> fuuuckyikyak 4w

You just gotta imagine that there are countless young people like her out there confiding in a machine because they feel like they can't talk to anyone else. And the companies that build them could create safeguards, but they refuse to because there's a chance it would eat into their profits or hurt their chances of cracking AGI first

7 upvotes
🐸
Anonymous replying to -> #1 4w

The gist of it was that she confided in the AI instead of her parents, friends, or therapist. It gave some good advice and some bad, but ultimately she chose to confide in it because she knew it would be the easiest way for her to kill herself without getting anyone she loved involved. The author (the mom, I believe) was making the point that had it been a human, or had there been safeguards in place, some red-flag procedures could've been triggered that might have saved this woman's life

3 upvotes
🐸
Anonymous replying to -> fuuuckyikyak 4w

Like how therapists are mandated reporters who can (if necessary) have someone involuntarily committed to a hospital if they are at serious risk of harming themselves

3 upvotes
Anonymous replying to -> #3 4w

What it needs is human oversight

10 upvotes
Anonymous replying to -> #3 4w

I wouldn't really play devil's advocate for the AI. You really shouldn't take the humanity out of something as sensitive as therapy

1 upvote