So my best friend and I had one of our regular conversations about sex, and we came to the conclusion that the majority of dudes think their sex game is wayyy better than it really is. I honestly think it's because some men have never been told they were bad or that they needed to improve at sex. Realistically, men fuck for themselves, and the majority of women fuck for the man as well. Basically, he gets a nut and then we're done.
I feel it's our duty as women to let these men know it wasn't good. I mean, you don't have to be mean or hurtful. There are plenty of ways to tell a man that his dick was wack. I can admit that I've had bad sex just because I wanted sex. But he knew that when I fuck, I fuck for me. Now when you start doing that "relationship" thing, then of course you compromise and fuck for each other's pleasure. But until then, I'll keep making sure that I get mine! But seriously, do you think it's wrong for a woman to tell a man that his sex is bad, or should she let him continue to roam the Earth with horrible dick?