Monday, October 14, 2013
Is Feminism becoming a fad?
This has been on my mind lately: is feminism becoming one of those trendy things to be, the way being bi was, or trying to be a hipster? You know what I'm talking about. I just feel like some girls who claim to be feminists aren't really the true definition of what a feminist is. Feminism is about women being treated with respect and being equal with men in all aspects. It's not about showing dominance over men or belittling them, because that goes against everything feminism stands for. And it's not about rioting over the media sexualizing women like Miley Cyrus, Mila Kunis, and other young starlets. That just doesn't make any sense to me, because probably half the reason they twerk or take half-naked photos is that they want to be known as sexy and get people talking about them. We obviously know there's more to them than that; even the stupidest people can figure that out. Brandi and I like to call these types of feminists "flower crown feminists," because when you see them protesting, or just in pictures in general, they're wearing flower crowns.

I also hate how some of these women downgrade men by saying every little thing a guy does for a lady is to get a blowjob in return, or something like that, or that men feel like they have all the power in life because they have a dick. It just doesn't make any sense to me, because you're being sexist towards men and viewing them one way, so how do you expect men to view us as more than a good lay or someone to make a sandwich in the kitchen? Granted, there are men like that out there, and it's horrible, but why make it sound like the whole gender is like that?

I consider myself a feminist in the sense that I just want equal pay for doing the same job as a man, and respect in the workplace: if I'm a bossy CEO of a company, I want to be considered a hard-working, respectable boss and not a bitch. I don't know, I've just been hearing this word a lot lately, and seeing people writing, blogging, and posting videos about it, and girls all of a sudden coming out saying they're feminists and acting like "go women! fuck men! liberate womanhood!" It just seems like a big fad to me. It doesn't even seem like some of these women and girls believe in, or even know, the whole point of feminism. It has nothing to do with liberating yourself or being over-sexualized or anything else they're preaching about. That's just our society, and yeah, it sucks, and yeah, it's hard being a girl in society today, and sure, we can't go anywhere without being looked at or catcalled, but stop saying you're a feminist; you just hate how society is, and that has nothing to do with the rights of women!

And just a disclaimer: this is purely my opinion. I'm sorry if I offended you, but please bitch about it to your friends or anyone else instead; don't leave mean or rude comments on here. I just wrote this because it's been on my mind lately and, like I said, I've been hearing a lot of talk about being a feminist and I'm just getting tired of it, in all honesty.