What can LLMs never do?
On goal drift and lower reliability. Or: why can't LLMs play Conway's Game of Life?
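For readers unfamiliar with the task the subtitle alludes to: Conway's Game of Life evolves a grid of cells by a simple local rule (a live cell survives with two or three live neighbours; a dead cell becomes live with exactly three). A minimal sketch, with illustrative names and a toroidal (wrap-around) grid chosen here for simplicity:

```python
def step(grid):
    """Advance a Game of Life grid (list of lists of 0/1) by one generation."""
    rows, cols = len(grid), len(grid[0])

    def live_neighbours(r, c):
        # Count the 8 surrounding cells, wrapping at the edges (toroidal grid).
        return sum(
            grid[(r + dr) % rows][(c + dc) % cols]
            for dr in (-1, 0, 1)
            for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)
        )

    return [
        [
            1
            if live_neighbours(r, c) == 3
            or (grid[r][c] and live_neighbours(r, c) == 2)
            else 0
            for c in range(cols)
        ]
        for r in range(rows)
    ]

# A "blinker" oscillates with period 2 between a vertical and a horizontal bar.
blinker = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
```

The rule itself fits in a few lines; the difficulty the title gestures at is applying it consistently, cell by cell, over many generations.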