Do they teach history at schools in the US?
Yes. But only to students who actually attend. Obviously neither Vance nor Trump ever saw the inside of a classroom.
They teach that WWI started in 1917, and WWII started in 1941.
Yes, currently a devolving, erasure-focused approximation of history.
He knows the truth, he lies. He says he makes up stories to support what he wants the base to hear.
Wait … you all have schools?
I thought American schools were forts.
Soon to be churches
Nah. Can't possibly.
Of course they do. They wouldn't have any easy places for mass shootings if they didn't.
Given some of the prominent talking points of a major party these days... I cannot take offense at this dig 🤷♀️
Also schools without shooter drills
We don't even have litter boxes.
We have teachers who try their best with absolutely nothing.
Silly man… history, especially US history, is too woke to be taught anymore. So glad I went through our public schools when they were still something to be proud of (1960s/70s).
Nobody remembers anything from school because we're so deeply incurious we never consider this history again once we get through the test
We used to, this clown went to Harvard.
Sorry *Yale. They should revoke that degree.
Nope.
Yeah, but Republicans are sociopaths. It doesn't bother them morally to lie about everything.
They do, he's just trying to lie and sanewash appeasement.
US high school history books are very clear about Germany & Japan being defeated & surrendering unconditionally.
Then JD Vance must have been skipping some classes to, uh, 'sit' on his couch.
hah #autocorrect
That could explain it.
He also "forgets' what a little bitch he was to Zelensky.
Not really, or you'd have heard about Somerset v. Stewart and realized some of the American colonies were on the wrong side. WWI was basically a case where all parties were assholes, and no colonies were freed.
Yes, but they've burned the text books.