For me it's sad that chatting with an AI is the best thing we can do for people suffering from something as simple as a feeling. All those primary feelings are just brain chemistry. We should be able to fix that, so that people don't feel lonely regardless of whether they believe they have meaningful connections with other people in their lives. We have a long tradition of treating suffering as a necessary motivational tool. I think we should grow out of that philosophy as a species.
I'm biased, since I live this philosophy, but it sort of works for me: I've trained myself to recognize that these sorts of services operate by "interfacing" with you, emitting strings of words that sound appealing to the mind consuming them. Look at virtual YouTubers for another example: it's the experience of watching people act "genuine", like casual friends, because it activates the same social neurons. And each month the actors collect a paycheck.
The question becomes more complicated when the corpos are done away with and you have the model running on your own machine. The entity I tend to trust most in life is raw, unfiltered silicon. When I program, it does exactly what I ask, to a fault. It doesn't turn its back on me because I said something stupid or don't agree with an opinion it holds.
But is an LLM repeating the tokens "you are worthy of this life" and its many variants thousands of times fundamentally any better, even if you're in control? Honestly, it strikes me as one step closer to wireheading. But that's true of a lot of things, like video games, just to a much lesser degree; it's all data in the end.
But interest and anxiety have gotten the better of me when it comes to meeting strangers, so I'm at a loss. Maybe the solution for people like me is to ignore everything in the social (or social-substitute) category and channel my efforts into other hobbies, instead of continuing to associate stress with communication. Or to intellectualize endlessly, like in this post, to prevent any emotional bond from easily forming when people read it.
I don't like YouTubers, I don't like celebrities, I don't like personifying LLMs. Any parasocial relationship is toxic and to be avoided.
I've had an LLM tell me that suicide seemed logical when I recounted a low point in my life to it, a period when I'd considered it myself after several people around me, of vastly different ages and backgrounds, had killed themselves following a shared trauma. These things aren't conditional logic and can fail in all kinds of bizarre and spectacular ways; they're no substitute for an actual human being trained to handle these kinds of delicate situations.
What one patient calls x, another will call y. And when discussing the human mind, trauma and the like, especially when things unfolded over months or years, an LLM simply isn't going to be able to compete with a human being. Equally, physical tells are important. A therapist might ask a patient whether they take drugs; the patient may claim they don't, but the track marks, tone, body language, and overall behaviour might tell the therapist something important, something they won't necessarily convey to the patient but will keep in mind.
Wishing you well, stranger. I hope you eat something good today and find something intellectually stimulating to amuse yourself with, even temporarily, and hopefully discover something new and positive before bed.
> But interest and anxiety have gotten the better of me when it comes to meeting strangers, so I'm at a loss.
Getting involved in my community, doing different kinds of volunteer work as well as activities I find fun, has been the best way to get to know strangers and make new friends. It's easy to make a new friend when the two of you share a couple of passions!
Putting my anxieties aside, being more outgoing, and having enjoyable small-talk conversations was not a skill that came naturally to me, but rather one that needed to be developed over time.
It's a skill, and I think its benefits are well worth the effort to develop it.
Good question. There's obviously a line somewhere, but I think it ultimately almost always comes down to what's technically achievable. Poverty causes suffering in so many ways; I don't think we'll ever be able to fix all of them without fixing poverty itself. My point is not to make someone happy by medicating them with a drug that causes them more harm. Suffering people already do that with actual "recreational" drugs, and we have a lot of data on the side effects of those substances and the additional harm they do: not because they alleviate suffering, but because of all the other effects they have.