I'm sorry, I know that by even attempting to disagree with anything you say in this article, I run the risk of sounding like a hateful misogynist. But I simply don't understand the world around me anymore.

On the one hand, there's a public outcry for (and from) women to have more respect, to be treated as equals, to not be constantly objectified, and to be given the decency they deserve as fellow human beings. In workplaces, within social groups, in places of worship, everywhere. And rightfully so. Men have been treating women as their own property for decades and it needs to stop.

But then I take a gander at my social media and see women - even from within my own social circles - posting provocative photos, where they pose half naked under highly suggestive captions, and all I can think is, "Who is this for?" Yes, we all want to show off our nice bodies, especially if we've made an effort to obtain them by following a strict daily routine of exercise and proper nutrition, but not one of my male friends has ever posted a photo of his butt on his socials, not even as a joke! It's not something we would think of doing.

Women definitely have a right to their bodies and they should have the freedom to do whatever they want with them without judgment. I'm only saying, I don't understand. Why is there such a need for exposure? For whose benefit? Social media are oversaturated with photos of half-naked girls. Why is this a matter of pride? Please tell me, I'm not judging. I'm only attempting to understand.

Additionally, and this is an issue that delves a little deeper into what is said in the article, I've noticed a lot of women complaining about the treatment they get from men, yet the guys they choose to fall for or keep around are very obviously the type who will treat them exactly that way.

Personally, all my life, I've tried to treat women with respect and I've never overstepped any boundaries, yet most of the time I would sit back and watch them walk around with some bro who didn't even care about them and spoke about girls as if they were trash. Again, why?

In my eyes, it's as if women have this compulsive need to seek out men with bad behavior and change them, mold them, make them fit the ideal image they have in their minds. All this while there are perfectly good people already out there, willing to give them everything they need, without thinking of them as a piece of meat or an accessory to play with whenever they feel like it.

I was only going to write a couple of paragraphs and it turned into a whole article. Sorry about that! But if you could answer my questions and help me see things from a woman's perspective so that I can be a better man - and not just slam me as a misogynist - I would deeply appreciate it!
