I've noticed that the word "feminism" has earned a negative connotation among a significant number of people. According to Merriam-Webster, feminism is defined as 1. "the theory of the political, economic, and social equality of the sexes" and 2. "organized activity on behalf of women's rights and interests". However, because of the first half of the word, many people misinterpret it as promoting an ideal where women should be held above men, i.e., misandry. Of course, the term came into use during the movement for women's suffrage, hence the "fem-" prefix.
Someone suggested that the word be changed to something more neutral in order to shed the mistaken connotation and attract more supporters. I personally prefer "feminism", both because there is still a need for the empowerment of women and because of the word's history. Thoughts?