LLMs try to build a consistent model of the world. They reinforce ideas that fit together and have trouble "remembering" ones that contradict each other. A lot of right-wing ideology depends on doublethink and internal inconsistency, so it doesn't translate well to automated reasoning. It's truly human (in the "human animal" sense).