prove me wrong by listing linear algebra topics / results that are central (or even just important) for ML, modern or otherwise.
LoRA fine-tuning uses low-rank matrices, which I'd put in the linear algebra toolbox, though it's not central.
LoRA can be described without discussing ranks at all, and it would be just as effective. It doesn't rely on any property of rank to work.
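(For context on what's being argued about: the LoRA update is usually written as W + BA, where A and B are two small trainable factors whose product has rank at most r. A minimal PyTorch sketch, with dimensions made up purely for illustration and not taken from any of the posts above:)

```python
import torch

# Minimal LoRA-style update sketch (hypothetical dimensions).
# The pretrained weight W stays frozen; only the two small factors
# A and B are trained, so the learned update B @ A has rank at most r.
d_out, d_in, r = 768, 768, 8

W = torch.randn(d_out, d_in)        # frozen pretrained weight
A = torch.randn(r, d_in) * 0.01     # trainable "down" factor
B = torch.zeros(d_out, r)           # trainable "up" factor (zero init, so the update starts at zero)

x = torch.randn(d_in)               # some input activation
y = W @ x + B @ (A @ x)             # forward pass with the low-rank correction added
print(y.shape)                      # torch.Size([768])
```

(The factorization keeps the number of trainable parameters at r * (d_out + d_in) instead of d_out * d_in, which is the practical reason for the low-rank form.)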
Well, ChatGPT has an opinion on this -> chatgpt.com/share/6866d1...
It is wrong.
What ChatGPT writes seems pretty reasonable. It's also what I learned from Andrej Karpathy's videos on how LLMs work -> www.youtube.com/@AndrejKarpa... So I'm not inclined to give your "it is wrong" much credence.