Has anyone else ever wondered why there is so much awkwardness surrounding sex, genitalia, etc.?
What is it about sex - and, for that matter, any sort of bodily "process" or activity - that makes us so awkward and embarrassed, at least when we're learning about it in school? Is it just that those parts of the body are associated with bodily waste that makes them somehow embarrassing to mention?
I'm really curious whether anyone has an idea if there's something more basic behind this than cultural conditioning.