Some people are trying to emulate AI by training themselves to think the way AI thinks.
In today’s column, I examine a newly emerging trend involving people opting to change how they think by emulating the way that AI thinks.
Actually, for clarification, it is a misnomer to say that AI can “think”; that wording anthropomorphizes what AI is computationally and mathematically doing. In any case, many people believe that AI thinks, so I am going to carry that premise forward in this discussion (please realize that the premise is not strictly valid).
A further complication is that some people comprehend the overall computational and mathematical processes underlying generative AI and large language models (LLMs), while others don’t have the faintest clue about what is happening under the hood. Thus, among those trying to emulate the way that AI thinks, one segment aims to mimic what AI is actually built to do, while a second segment relies on whatever wild mental contrivance they imagine contemporary AI to be performing.
All told, the idea, simply stated, is that some people admire or respect the way that generative AI and LLMs get things done, and they desire to think like AI. One motivation is that doing so might improve their existing thinking processes. Another is that it would simply be cool to emulate AI. Lots of reasons exist.