Dammit, it’s not the output 2

Last week I noted that people were trying to sound “more human” by varying their vocabulary and grammar. I predicted that Altman’s coders would quickly counteract this tactic by adding more variation to the bots’ output. A couple of days later a Verge item revealed that Altman’s coders were already working on it.

I’m more concerned with the input side, the theft itself, than with the processed output of the thefts. Altman’s coders should work on their theft patterns as well. The bots that steal material are rigidly mechanical, hitting each source exactly the same number of times.

Living animals seek novelty and change. It’s built into every layer of our nervous systems. We ignore regularity and notice changed patterns SHARPLY.

Computers aren’t alive, so they don’t naturally seek change. They mechanically perform the same action as many times as the loop specifies. This has always been the value of MACHINES, from printing presses to assembly-line robots. Perfect regularity, every page or piston exactly the same.

I haven’t touched or changed my previous Blogspot blog since switching to WordPress at the end of 2021. There’s nothing new or recent, nothing salient. But the old blog still gets a steady flow of PERFECTLY UNIFORM “reads” every day. Here’s a small part of the list, which goes on pretty much forever. The topics of these posts have nothing in common, so they wouldn’t be the result of a real search. I haven’t linked any of them here or in comments elsewhere.
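For what it’s worth, uniformity like that is trivial to flag in code. Here’s a minimal sketch in Python; the daily counts are invented stand-ins for what a stats page shows, not real data. Living traffic is ragged, machine traffic is flat, and the coefficient of variation separates the two.

```python
import statistics

# Illustrative daily "read" counts for one post over two weeks.
# These numbers are made up; the real ones would come from the
# blog's stats page.
human_reads = [3, 0, 7, 1, 0, 12, 2, 4, 0, 1, 9, 0, 3, 5]
bot_reads   = [6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6]

def looks_mechanical(daily_counts, threshold=0.2):
    """Flag a post whose daily counts are suspiciously flat.

    Living traffic has a large standard deviation relative to its
    mean. A coefficient of variation near zero means every day
    matches to the digit: a machine's signature, not a reader's.
    """
    mean = statistics.mean(daily_counts)
    if mean == 0:
        return False  # no traffic at all, nothing to judge
    cv = statistics.stdev(daily_counts) / mean
    return cv < threshold

print(looks_mechanical(human_reads))  # False: ragged, alive
print(looks_mechanical(bot_reads))    # True: perfectly uniform
```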

If the AI thieves want their thefts to be less obvious, they should try for more randomness. It’s not hard to get lifelike randomness in a program. It doesn’t take AI; it just takes an understanding of living patterns.
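As a concrete sketch of what “lifelike” could mean, here’s a bit of Python. Every detail is assumed for illustration: the hypothetical page list, the Zipf-style popularity weights, the log-normal delay parameters. The point is only that heavy-tailed pauses and uneven page choice replace the fixed loop.

```python
import random
import time

PAGES = [f"/post-{n}" for n in range(1, 21)]  # hypothetical post URLs

# Real readers don't visit every page equally often; popularity is
# roughly Zipf-distributed, so weight page choice accordingly.
WEIGHTS = [1.0 / rank for rank in range(1, len(PAGES) + 1)]

def human_like_delay():
    """Draw a pause from a heavy-tailed distribution.

    Human inter-action gaps cluster around a few seconds but
    occasionally stretch to minutes; a log-normal captures that
    shape far better than a fixed sleep.
    """
    return random.lognormvariate(mu=1.5, sigma=1.0)  # ~4.5 s median

def browse(session_length=None):
    """Simulate one reading session with uneven, bursty access."""
    # Session lengths vary too; most visits are short.
    n_requests = session_length or max(1, int(random.expovariate(1 / 5)))
    for _ in range(n_requests):
        page = random.choices(PAGES, weights=WEIGHTS)[0]
        print(f"GET {page}")
        time.sleep(human_like_delay())

if __name__ == "__main__":
    browse()
```

Nothing clever there. The loop simply stops producing counts that match to the digit, which is all it takes to stop looking like a piston.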

As with the output, I’m sure Altman’s coders are working on this as well.