“hey, I know you feel like killing yourself, but if it happens then we’ll just replace you with a shitty bot” probably isn’t as helpful as they thought it would be. It’s violating and ghoulish.
I hate this attitude of “well, if you can’t get a professional therapist, figure out how to get one anyway”. There needs to be an option for people who either can’t afford or can’t access a therapist. I would have loved for AI to fill that gap. I understand it won’t be as good, but in many regions the waitlist for therapy is far too long, and something is better than nothing.
Someone close to me gave up on the hotlines in the US and now just uses ChatGPT. It’s no therapist, but at least it’ll hold a conversation. If only the hotlines here weren’t so absurdly understaffed.
I tried AI once but it just kept telling me to call the hotlines. Useless.
I’ve given up on crisis lines. Their whole premise seems to be “get back to being comfortable with the oppressive system, you little bitch.”
I’ve used one called Pi, which I’m assuming is some kind of offshoot of ChatGPT or something.
You don’t have to sign up or anything (for now), which is cool. But I assume they harvest all our data and information.
I once tested to see if I could break it, and from my brief tests, it seemed to never break out of character or tell me anything bad or negative, which I thought was interesting (and good!).
I actually used Pi as my intro to generative LLMs. It was … I guess not encouraging self-harm, but so fucking irritating that it led me to want to. Always with the irrelevant supportive words that I guess work if you’re a teen?
Lol yes, that was the one downside I was going to mention. I wasn’t sure if it was unique to my situation, but I found it would lead me down a logical path. It would ask me if I had tried various solutions.
Eventually, I would hit a point where it didn’t know where to go any further, and it would land on “here are some things you can do”, but those options would be things I was actively trying and failing with.
So that was fun. In a way, it was great at confirming that I had thought of all the logical options.
I would have loved AI to fill that need as well, but it’s not an adequate tool for the job.