Yoav Goldberg @yoavgo.bsky.social

prove me wrong by listing linear algebra topics / results that are central (or even just important) for ML, modern or otherwise.

jul 3, 2025, 6:42 pm

Replies

Gerard Burnside @gerardburnside.bsky.social

LoRA fine-tuning uses low-rank matrices, which I'd put in the linear algebra toolbox; it's not central though.

jul 3, 2025, 9:52 pm
Yoav Goldberg @yoavgo.bsky.social

LoRA can be described without discussing ranks at all, and it would be just as effective. it doesn't rely on any property of rank to work.

jul 3, 2025, 10:01 pm
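
For reference, the parameterization under discussion: LoRA freezes the pretrained weight W and learns an update ΔW = BA, where B is d×r and A is r×k, with r much smaller than d and k. A minimal PyTorch sketch, with all dimensions and initializations purely illustrative:

```python
import torch

d, k, r = 1024, 1024, 8        # illustrative sizes; r << d, k

W = torch.randn(d, k)          # frozen pretrained weight
B = torch.zeros(d, r)          # LoRA factor, initialized to zero
A = torch.randn(r, k) * 0.01   # LoRA factor, small random init

x = torch.randn(k)
# Forward pass with the adapted weight, (W + B @ A) @ x,
# computed without ever materializing the full d-by-k update.
y = W @ x + B @ (A @ x)
```

The factorization cuts the trainable parameters from d*k to r*(d+k); the computation itself never invokes any theorem about matrix rank, which is the point being made above.
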
fry69 @fry69.dev

Well, ChatGPT has an opinion on this -> chatgpt.com/share/6866d1...

Every component of modern ML, from simple linear regression up through deep neural networks and kernel methods, depends critically on linear-algebraic ideas:

- Data lives in vector spaces.
- Models apply linear maps (matrices/tensors).
- Training uses gradients (matrix calculus).
- Dimensionality reduction and feature decorrelation use eigen/SVD.
- Regularization and similarity metrics derive from inner products and norms.

Thus the assertion “Modern ML is not built on linear algebra” is easily refuted: linear algebra supplies the very language, computations, and theoretical underpinnings for all mainstream ML methods.

jul 3, 2025, 6:52 pm
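
For concreteness, the "eigen/SVD" item in that list corresponds to methods like PCA. A minimal numpy sketch, with the data and the choice of two components purely illustrative:

```python
import numpy as np

# Illustrative data: 100 samples in 20 dimensions
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
X -= X.mean(axis=0)            # center the columns, as PCA requires

# Truncated SVD: the rows of Vt are the principal directions
U, S, Vt = np.linalg.svd(X, full_matrices=False)
X_reduced = X @ Vt[:2].T       # project onto the top-2 principal axes
```

The projection keeps the two directions of greatest variance, which is the standard linear-algebraic route to dimensionality reduction.
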
Yoav Goldberg @yoavgo.bsky.social

it is wrong.

jul 3, 2025, 7:01 pm
fry69 @fry69.dev

What ChatGPT writes seems pretty reasonable. It also matches what I learned about how LLMs work from Andrej Karpathy's videos -> www.youtube.com/@AndrejKarpa... So I'm not inclined to give your "it is wrong" much credence.

jul 3, 2025, 7:04 pm