Better name

Responsible people are quite properly worrying about AI’s habit of hallucinating.

We’d have a clearer conversation if we called it dreaming. Everyone dreams. The process is the same whether it happens in sleep, in a drug-induced haze, or in ChatGPT.

A system assembles facts and images and sensations in ways that don’t occur in real life.

We’re familiar with dreams and we know their proper use. We can use the weird combinations to trigger a practical invention or a change in habits. We can’t treat the weird combinations in dreams as history.

Before ChatGPT we had a variety of external dream generators. Tarot, I Ching, *mancies of all types. Take a set of objects with pre-assigned meanings. Toss the objects into a random pattern and read the pattern of the assigned meanings from the objects.
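The toss-and-read procedure can be sketched in a few lines of Python. This is a minimal illustration, not a faithful Tarot implementation; the card names and assigned meanings here are placeholders:

```python
import random

# Objects with pre-assigned meanings (illustrative, not a real deck).
MEANINGS = {
    "The Tower": "sudden upheaval",
    "The Star": "renewed hope",
    "The Hermit": "withdrawal and reflection",
    "The Wheel": "a turn of fortune",
    "The Moon": "illusion and dreams",
}

def draw_pattern(rng, count=3):
    """Toss the objects into a random pattern and read off the meanings."""
    cards = rng.sample(list(MEANINGS), count)
    return [(card, MEANINGS[card]) for card in cards]

for card, meaning in draw_pattern(random.Random()):
    print(f"{card}: {meaning}")
```

The randomness supplies the weird combinations; the pre-assigned meanings give the reader something to interpret. The generator itself knows nothing.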

We also use objects called poets and painters and playwrights, and we appreciate them most when they assemble ideas randomly and productively.

ChatGPT is software Tarot, a software poet. It can be used to trigger inventions or changes in life, but it CAN’T be treated as a factual account of history.

The most important part of any creative process is the editing. When a designer or poet is allowed to run free, with no budget constraints or editing, the result is uniformly horrible. Good results come from good editing.