02-28-2014, 07:40 PM
(02-28-2014, 05:57 PM)Koopaul Wrote: Well maybe you're right. Provide examples outside entertainment where our society says it's okay to objectify women.
I know there are dickheads who treat women like trash. But no one says what those guys do is okay. Right?
The way I see it, just because we know violence is wrong doesn't stop bad people from being violent. And just because we know sexism is wrong doesn't stop bad people from being sexist.
But that is where we unfortunately disagree. I wish we could all agree.
I fucking hate to pull the "check your privilege" card but... here it goes.
You simply don't understand what it's like to be a woman, because you're a man. You'll never have to deal with the sort of things they do because you aren't them.
Why do you think that the biggest way a lot of advertisers sell things is by slapping a pair of tits on it? Or why do you think it's not all that uncommon for a woman just walking down the street to be catcalled and ogled? It sure as hell isn't because anyone cares about her as a human being. I could go on and on about how women can be called things like "hot pieces of ass" without an eyebrow being raised. It's such a pervasive part of society that it's hard to convince people who don't see the problem that it's actually a problem.
And why do you think the people actually raising a stink about it aren't the majority? Because society for the most part doesn't fucking care that these things go on. "It's just part of life." "It's not that big a deal." "What's the problem?" By being indifferent, people are essentially cosigning this behavior onto the big list of "acceptable things to do."
I'm not gonna blame them. They just don't know better. But things aren't gonna get better until people actually fucking start correcting those who act in damaging ways. Things won't get better if we stay indifferent to the injustices going on around us.