Jack Aidley @jackaidley.bsky.social

There's a grain of truth in that description, but it's misleading enough that I think it shouldn't be used. It is not calculating the likelihood that one word follows another, or even that one word follows the twenty or ten thousand before it.

Aug 12, 2025, 5:42 pm
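To illustrate the distinction being made here, the sketch below contrasts a literal bigram lookup table ("which word follows which") with a stand-in function whose prediction depends on the entire context. This is purely illustrative: the `llm_like_probs` scoring rule is a made-up toy, not how any real model works, but it shows why "likelihood that one word follows another" mischaracterises context-conditioned prediction.

```python
# Toy contrast: bigram table vs. whole-context prediction.
# Neither is a real LLM; the point is the shape of the computation.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# A bigram model really does just count which word follows which:
bigram = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram[prev][nxt] += 1

def bigram_probs(prev):
    """Distribution over next words given ONLY the previous word."""
    total = sum(bigram[prev].values())
    return {w: c / total for w, c in bigram[prev].items()}

def llm_like_probs(context):
    """Hypothetical stand-in: the output depends on ALL words in the
    context (here via a recency-weighted score), not just the last."""
    scores = Counter()
    for i, w in enumerate(context):
        for nxt, c in bigram[w].items():
            scores[nxt] += (i + 1) * c
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()}

print(bigram_probs("the"))             # depends only on the last word
print(llm_like_probs(["the", "cat"]))  # changes if earlier words change
```

Changing any earlier word in the context shifts the second distribution, which a bigram table cannot do; that is the sense in which an LLM conditions on the whole window.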

Replies

Jack Aidley @jackaidley.bsky.social

The best explanation I've seen is a sequence of half-hour videos, which are great, but I hate coming back to people asking for explanations with that kind of answer. They're the kind of thing that just isn't amenable to simple explanations, I guess.

Aug 12, 2025, 5:42 pm
Jack Aidley @jackaidley.bsky.social

Those are videos by 3Blue1Brown on YouTube, if you are interested.

Aug 12, 2025, 5:44 pm
Nick Drage @sonofsuntzu.bsky.social

I mean, a short paragraph would have been lovely - but only if the subject will fit. Any specific videos of 3B1B's to start with, or just any of theirs on the topic? And thank you for a considered answer, much appreciated :)

Aug 12, 2025, 8:05 pm
Jack Aidley @jackaidley.bsky.social

He's done a number of them on AI and LLMs. The key bit to understand is what the Transformer does. Usually I'd be more specific but I'm on holiday this week and only have my phone with me.

Aug 12, 2025, 8:11 pm
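The "key bit" referred to here, the Transformer's core operation, is scaled dot-product self-attention: every position's output is a weighted mix of all positions' value vectors, with weights from query-key similarity. A minimal from-scratch sketch with toy numbers (real models use learned projection matrices and many heads; the vectors below are arbitrary):

```python
# Minimal scaled dot-product self-attention, pure Python, toy inputs.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Each output vector is a convex combination of ALL value
    vectors, weighted by how well its query matches each key."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)  # non-negative, sums to 1
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Three toy token vectors attending to each other (self-attention):
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
mixed = attention(x, x, x)
```

Because every output mixes information from every input position, this is the mechanism that lets the model condition on the whole context rather than on adjacent-word statistics.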
Nick Drage @sonofsuntzu.bsky.social

Thank you... meanwhile I hope you have a great time 🙂

Aug 12, 2025, 8:15 pm