So I moved to a large city recently and have gone out or hung out with many women I've been introduced to by colleagues and friends.
I'm noticing a trend with women these days: they find any excuse to bring up sex, or to mention that they've kissed a woman. Why do they do this?
I seem to remember that when I was younger, around high school age, it was the guys chasing after the girls and talking about sex while the girls played dumb. When did things reverse?