For some reason I feel like women, and bottoms in general, like knowing they can turn a guy on -- maybe it gives them confidence or validates that they're attractive enough to sleep with. Ads alone demonstrate that sex sells, BUT isn't it better to be known for your skills and personality?

What has sex done for your life?