ELI5: Explain Like I'm 5

Feminism in the United States

Feminism is the belief that men and women should be treated the same and given the same opportunities, rights, and privileges. In the United States, women have fought for their rights and for equality for a very long time.

A long time ago, men were considered more important than women. Women were not allowed to vote, own property, or work in many jobs that men did. This was really unfair, and so women started to protest and fight for their rights.

One of the most important movements in the history of feminism in the United States was the Women's Suffrage Movement, which began in the mid-19th century. This movement aimed to give women the right to vote, which they finally won in 1920 when the Nineteenth Amendment was passed.

Since then, many other feminist movements have taken place in the United States, each one trying to make things better for women. These movements have drawn attention to issues such as equal pay for equal work, sexual harassment, and sexual assault.

Today, feminism is still an important belief that is shared by many people, both men and women. Many people continue to fight for women's rights and equality in the United States, and this has led to some significant changes for women over the years. However, there is still a lot of work to be done to ensure that all women are treated fairly and equally.