As a feminist, I feel like we are losing sight of what feminism is. These days people say they are feminists because it's the cool thing to be. Also, some self-described feminists seem more against men than against gender inequality. We have forgotten that feminism is about EQUAL RIGHTS FOR BOTH WOMEN AND MEN. Let us not neglect the boy child while uplifting the girl child. What do you think?