Imperialism is a policy in which one country extends its power by taking control of other territories and peoples. It has occurred throughout history, but it became far more widespread during the 19th century.
During that time, Europe was especially powerful, and countries such as Britain, France, and Spain controlled colonies in Africa, Asia, and the Americas. These colonies supplied wealth, raw materials, and markets, and the colonizing powers also ruled over the people who lived there, usually without their consent.
Imperialism was a defining force of the 1800s, and it reshaped the political map of the world. By the end of the century, however, anti-imperialist movements, including in the United States, began to argue that it was wrong for one country to rule another.
That criticism grew into a broader movement for self-determination and independence. Over the course of the 20th century, and especially after World War II, most colonies gained independence from their European rulers and began to govern themselves. This decolonization still shapes our world today, because it allowed nations to choose how they wanted to be governed instead of being ruled from abroad.