LLMs are a mathematical model of language tokens. You give an LLM text, and it gives you a mathematically plausible response to that text. LLMs are not brains and do not meaningfully share any of the mechanisms that animals or people use to reason or think. softwarecrisis.dev/letters/llme...
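The "mathematically plausible response" idea can be sketched with a toy example: a bigram model that samples the next token from observed frequencies. This is an illustration of the statistical principle only, not how a real LLM is built (real models use neural networks over vastly larger vocabularies and contexts); the corpus and function names here are invented for the sketch.

```python
import random

# Toy corpus; a real LLM trains on trillions of tokens.
corpus = "the cat sat on the mat the cat ate the rat".split()

# Count which token follows which -- the model's entire "knowledge".
follows = {}
for prev, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(prev, []).append(nxt)

def plausible_next(token):
    """Sample a statistically plausible next token given the previous one."""
    return random.choice(follows[token])

# Given "the", the model emits whatever tended to follow "the":
# one of "cat", "mat", or "rat" -- plausible, but nothing is "thinking".
print(plausible_next("the"))
```

The output is driven purely by token statistics, which is the post's point: plausibility, not reasoning.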