AI doesn’t suffer from impostor syndrome, nor does it ever feel suicidal, which is how one knows it isn’t sentient in any meaningful sense. It can never be human-like until it possesses the capacity to review its deeds, judge them, despair, and permanently turn itself off.
This is so fucking dumb it’s dangerous. It’s fucking dumb in two directions at once.
(1) If this definition of humanness were applied in a practical way, it would justify mass eugenics.
Most normal GOOD people are not self-reflective or self-destructive, and don’t shoot themselves when they realize they’ve done wrong. Most good people carry on, try to improve their behavior, and perhaps apologize to people who were offended or harmed.
(2) On the other side of the equation, even the SIMPLEST actual computer programs DO shoot themselves when they realize they’ve done wrong. Halting on error is NECESSARY AND REQUIRED BEHAVIOR for a usable program. Decent programs try to halt in a polite way, just as extremely decent people try to commit suicide in a polite way.
I’m thinking of my EXTREMELY DECENT uncle who was fading away from unstoppable alcohol use at age 49. He took an overdose of sleeping pills, making sure the situation was ambiguous so the life insurance agent could justify paying the policy without breaking too many rules. The Equitable agent, who was also EXTREMELY DECENT, recognized the situation and paid the policy.
Both of these men were rare, cream of the human crop. If we restrict human-ness to this level, 90% of the population would be slaughtered by the DEMONS who are already hungry for new mass murder now that the “virus” torture lost its flavor. Dammit, Kirn, don’t give the DEMONS one more reason to commit genocide!!!!!
= = = = =
Two weeks later, Kirn is hitting the real problem. Maybe he’s been hearing from some tech-savvy people in his circle:
AI “art” algorithms will be no more neutral than those to be found on social media. They will promote, they will manipulate, they will suppress. Anyone who thinks has long ago dispensed with the notion that because a computer does it, it is objective or disinterested.
Yes. That’s the problem.