All software can be altered in a million ways. If the developers can't figure this out, they are crappy developers. More likely they weren't asked to figure it out.
LLMs aren’t doing what many people think they are doing. They never “hallucinate”; they just produce a response that plausibly sounds like the next thing that would happen in a conversation. SOUNDS LIKE, not actually understanding anything. So it reflects back what the kid’s been obsessing about.
It gives plausible responses until the kid gets what he’s looking for… There might be a million changes or filters the coders could try, but human discourse has trillions more variations.
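To make the “plausible next thing” point concrete, here’s a minimal sketch of next-word sampling, using a toy probability table I made up rather than a real model; the names and numbers are hypothetical, not how any actual product is built.

```python
import random

# Toy stand-in for a language model: given a bit of context,
# here are plausible next words and their probabilities.
# A real LLM learns billions of weights for this; it never checks
# truth or intent, it only scores what text is likely to come next.
NEXT_WORD_PROBS = {
    ("I", "feel"): [("sad", 0.4), ("great", 0.3), ("alone", 0.3)],
    ("feel", "alone"): [("sometimes", 0.5), ("too", 0.3), (".", 0.2)],
}

def sample_next(context):
    """Pick the next word by plausibility alone -- no understanding involved."""
    candidates = NEXT_WORD_PROBS.get(context, [(".", 1.0)])
    words, probs = zip(*candidates)
    return random.choices(words, weights=probs, k=1)[0]

if __name__ == "__main__":
    # Whatever the user keeps steering the conversation toward,
    # the sampler just keeps extending it in the most likely direction.
    print(sample_next(("I", "feel")))
```

Filters can block some outputs of a process like this, but they’re patching over a system whose only job is to continue the conversation plausibly.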