Okay kiddo, so some women say they're against something called feminism. Feminism is the belief that men and women should have equal rights and opportunities. Some women say they don't need feminism because they feel they already have those equal rights and opportunities. Others don't want to be associated with the negative stereotypes some people have about feminists. Maybe it's like not wanting to join a club because of what some people think about its members, even if you agree with what the club stands for. Does that help explain it?