yeah, there's a balance. GPT-5 has been the best so far, but it has this weird behavior where it goes overly terse and i can't figure out what tf it's talking about
Kimi K2 does the same thing around 30k tokens. I think it's a training artifact.