My dude.
I’m not arguing about empathy itself. I’m arguing that technology is entirely incapable of genuine empathy on its own.
“AI”, by the most basic definition, is nothing more than a program running on a computer. That computer might be made of many, many computers with a shitton of processing power, but the principle is the same. It, like every other kind of technology out there, is only capable of doing what it’s programmed to do. And genuine empathy cannot be programmed, because genuine empathy is not logical.
You can argue against this until you’re blue in the face. But it will not change the fact that computers do not have human feelings.
Actually, a lot of non-LLM AI development (and even LLMs, in a sense) is based very fundamentally on concepts of negative and positive reinforcement.
In such situations… pain and pleasure are essentially the scoring rubrics for a generated strategy, and fairly often, in group scenarios… something resembling mutual trust, concern for others, ‘empathy’, arises as a stable strategy, especially if agents can detect or are made aware of the pain or pleasure of other agents, and if goals are achieved more successfully through cooperation.
This really shouldn’t be surprising… as our own human (mammalian, really) empathy is fundamentally just a biological sort of ‘answer’ to the same sort of ‘question.’
It is actually quite possible to base an AI more fundamentally on a simulation of empathy than on a simulation of expansive knowledge.
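To make that concrete, here’s a minimal sketch, assuming a toy iterated prisoner’s dilemma and stateless Q-learning… every payoff, weight, and hyperparameter here is my own illustrative invention, not any real lab’s setup. The point is just that when each agent’s reward blends in the other agent’s payoff (an ‘empathy’ weight), mutual cooperation, not defection, becomes the stable learned strategy.

```python
# Toy sketch: two Q-learning agents in an iterated prisoner's dilemma, where
# each agent's learning signal can blend in the other agent's payoff via an
# "empathy" weight. All numbers are illustrative inventions for this comment.
import random

# PAYOFF[my_action][their_action]; action 0 = cooperate, 1 = defect.
PAYOFF = [[3, 0],   # I cooperate: 3 if they cooperate, 0 if they defect
          [5, 1]]   # I defect:    5 if they cooperate, 1 if they defect

def train(empathy=0.0, episodes=5000, alpha=0.1, epsilon=0.1):
    """Stateless (bandit-style) Q-learning for two agents; returns greedy actions."""
    q = [[0.0, 0.0], [0.0, 0.0]]  # q[agent][action]
    for _ in range(episodes):
        # Epsilon-greedy action selection for each agent.
        acts = [random.randrange(2) if random.random() < epsilon
                else max((0, 1), key=lambda a: q[i][a])
                for i in range(2)]
        raw = [PAYOFF[acts[0]][acts[1]], PAYOFF[acts[1]][acts[0]]]
        for i in range(2):
            # "Empathy": my effective reward includes a share of the other's payoff.
            r = raw[i] + empathy * raw[1 - i]
            q[i][acts[i]] += alpha * (r - q[i][acts[i]])
    return [max((0, 1), key=lambda a: q[i][a]) for i in range(2)]

for w in (0.0, 1.0):
    actions = train(empathy=w)
    label = "mutual cooperation" if actions == [0, 0] else "defection"
    print(f"empathy weight {w}: agents converge on {label}")
```

With the weight at 0, defection strictly dominates and both agents learn it; at 1.0, cooperating pays at least as well against anything the other agent does, so both converge on it.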
Unfortunately, the people in charge of throwing human money at LLM AI are largely narcissistic sociopaths… so of course they chose to emulate themselves, not the basic human empathy that they lack.
Their wealth only exists and is maintained through their construction and refinement of elaborate systems for confusing, destroying, and misdirecting the broad empathy of normal humans.
At the end of the day, LLM/AI/ML/etc is still just a glorified computer program. It also happens to be absolutely terrible for the environment.
Insert “fraction of our power” meme here
Yes, they’re all computer programs; no, they’re not all as spectacularly energy-, water-, and money-intensive, or as reliant on mass plagiarism, as LLMs.
AI is a much, much more varied field of research than just LLMs… or, well, rather, it was, until the entire industry decided to go all in on what five years ago was just one of many, many radically different approaches, such that people now basically just think AI and LLM are the same thing.
I don’t care if it’s genuine or not. Computers can definitely mimic empathy and can be programmed to do so.
When you watch a movie, you’re not watching people genuinely fight/struggle/fall in love, but it mimics it well enough.
Jesus fucking christ on a bike. You people are dense.
Removed by mod
What the fuck is the jump to personal attacks?
This is the comment that started this entire chain:
“I refuse to participate in this. I love all robots. And that’s totally not because AI will read every comment on the Internet someday to determine who lives and who does not in future robotic society.”
I made an equally tongue-in-cheek comment in response, and apparently people took that personally, leading to personal attacks. You can fuck right off.
You mean like: “Jesus fucking christ on a bike. You people are dense.”?
Well, that’s a bad argument. This is all a guess on your part that is impossible to prove. You don’t know how empathy or the human brain works, so you don’t know it isn’t computable; if you can explain these things in detail, enjoy your Nobel Prize. Until then, what you’re saying is baseless conjecture with pre-baked assumptions that the human brain is special.
Conversely, I can’t prove that it is computable, sure, but you’re asserting those feelings you have as facts.
You:
That’s pathetic.