Tim Kellogg @timkellogg.me

this is hyperparameters, not data (e.g. batch size, learning rate, ...). you actually can do reverse (little -> big) distillation pretty easily; e.g. rephrasing generally doesn't need a big model

aug 31, 2025, 8:42 pm
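
A minimal sketch of the "little -> big" idea: a small model rephrases raw examples, and the rephrased pairs feed the bigger model's training set. The flan-t5-small model and the "Paraphrase:" prompt here are illustrative assumptions, not anything from the post.

```python
# Sketch, assuming Hugging Face transformers is installed and that
# flan-t5-small (an arbitrary small model chosen for illustration)
# is good enough at rephrasing.
from transformers import pipeline

rephraser = pipeline("text2text-generation", model="google/flan-t5-small")

raw_examples = [
    "The cat sat on the mat.",
    "Batch size and learning rate are hyperparameters.",
]

augmented = []
for text in raw_examples:
    # The small model does the rephrasing work; no big model involved.
    out = rephraser(f"Paraphrase: {text}", max_new_tokens=64)[0]["generated_text"]
    augmented.append({"original": text, "rephrased": out})

# `augmented` would then be mixed into the bigger model's training data.
print(augmented)
```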
