As a feminist, I feel like we are losing sight of what feminism is. These days people say they are feminists because it’s the cool thing to be.
Also, some feminists seem to be more against men than they are against gender inequality. With the rise of radical feminism, we have forgotten what feminism is genuinely about: EQUAL RIGHTS FOR BOTH WOMEN AND MEN.
Let us not neglect the boy child while uplifting the girl child. What do you think?