So I’m currently reading this book entitled “Girls & Sex” by Peggy Orenstein.
A few pages in, it makes an interesting point about how TV ads and clothing companies pressure women to sexualize and promote their bodies. This really put it all together for me.
Even the saying “sex sells” is terrible. What does that mean for young women growing up in today’s culture, where so many turn being naked or half naked into a full-time career (Instagram models)?
As a country we are definitely spreading the wrong message. The book also talks about bikinis that are made for infants. That’s ridiculous, and it bothered me.
I can’t imagine how much stress this puts on young ladies of all walks of life. How do they find and value their true self-worth when they are constantly shown and told that skinny and naked is what’s acceptable, and that anything else is too conservative and won’t bring them better opportunities?
For years I blamed women for the things they wore, for example, but I was so wrong. It’s society as a whole that promotes sex as a means to more opportunity for women. It’s wrong, and the majority of men are not sexualized and told to become sex symbols in this manner.
This type of thinking and behavior causes women to accept being looked at as objects instead of as incredibly capable human beings.
As a man I am looking at this from both sides and still have trouble taking it all in, based on a lifetime of programming and being oblivious to what’s really happening in America and around the world.
Even the fact that a man and a woman working in the same position can be paid differently (the man being paid more) is looked at as normal today. It’s not normal or right.
When will we ever reach true equality for all?
Sad to say, I’m not sure we ever will. What is your take on any of this?
I hope I don’t offend anybody. Just giving my opinion on the matter.