Is sex really as important in a relationship as the media makes it out to be? Ladies?
First off, I'm a guy, but I have to stick my nose in here. My sister is standing over my shoulder reading this, and she is a therapist. She and I share the same view on this, and it's probably the same as many others out there. All women and men are different, but when it comes down to it, sex is important in a relationship, just like physical attraction. Anyone who tells you differently is either blind or lying to your face. It's the way humans are wired. It doesn't make you a bad person by any means, but in the end, sex and physical attraction will always play a significant role in how a relationship plays out.

A woman has needs just like a man does, and she is just as likely to cheat as a guy is; the only difference is that a woman tends to be more careful and is less likely to get caught. Sex is usually great at the beginning of a relationship because you're exploring new territory. However, if the sex isn't all that great at the beginning, you probably shouldn't be with that guy or girl in the first place. To me, true love isn't all about looks and sex, but they do play a role.
I hope this helps.