I'm genuinely puzzled by how so much of the US is portrayed as "God-fearing" and expects women to be virgins when they marry, and yet pretty much every movie, series, and advertisement portrays women as sex-crazed Jezebels just waiting to seduce every man they meet. What am I missing?