For the LLM to know it HAS to build a world-model to make predictions, it would first have to KNOW that it needs to make predictions, but a thing can't KNOW a function without already having a world-model to abstract that function!
This is all the more reason to force STEMlords to actually take philosophy courses, and to require them to pass. While we're at it, a course in philosophy of science would do wonders for their clockwork orange brains.