The most popular social topic among Liberals and Conservatives in the USA right now, and I think everyone knows about it already, is feminism. In the old understanding, feminism meant equality: women should be treated the same as men no matter the position, no matter where they come from, and should be paid the same. But later it turned into a revolution of man-hating.
It's kind of absurd that some women have taken advantage of feminism and gone overboard, trying to hold society hostage with "morality" to get whatever they want.
First, women are not born the same as men, and by this I mean body structure. Most men are born more muscular and aggressive because of higher hormone levels. I don't think I need to explain this, since you learned it in school when you were twelve.
Second, when you look at job applications, you don't see many women applying to work in coal mines, sewers, construction sites, and other dirty, nasty places. Those jobs require physical strength, and most people already associate them with hard-working men. So why should women be paid the same as men when most women only want to apply for cashier or office jobs, the easier ones?
Third, everyone has the right to say what they want; I don't disagree with that. But trying to impose your ideas on other people is not right, and that is what most feminists do in real life: they demand this right or that right for feminists. Well, you don't run the world.
Finally, I think everyone should just get back to their real lives and start spending time with family and friends, even doing meaningless stuff, rather than protesting on the street for no reason.