Several articles are noting ChatGPT’s LIBELOUS habit of mixing and matching biographies. For instance, Brian Moore, an Australian mayor, is suing OpenAI for defamation. In 2008 he worked in banking and reported a money-laundering scheme to the authorities. ChatGPT insists, with the full weight of authority, that he was the CRIMINAL, not the WHISTLEBLOWER. This is an inexcusable and unforgivable error.

Google’s search AI mastered sentence understanding 10 years ago. Google always knows who you mean and what you mean, provided you give it just enough detail. If you fully specify an unrelated Brian Moore, you will get an unrelated answer. If you specify this Brian Moore, currently mayor of this Australian city, you will get the publicly known facts about this Brian Moore. You certainly won’t get an exact reversal of the facts.

It takes special effort to turn facts upside down.

False accusation is a standard part of witch-hunting, and ChatGPT is unquestionably a highly refined weapon for it.

Later: This reminds me of the final scene in Walker Percy’s 1971 dystopia ‘Love in the Ruins’, which is EXACTLY about a medical holocaust aimed mainly at the elderly. On the fucking dot. In the final scene, Satan is passing out specially modified iPhones that seem to be jumbling everything up in a funny way, but always “just happen” to create slander and fights and riots. Percy’s phones include direct thought control, a feature that isn’t quite here yet. Elon is working on it.
