ELI5: Explain Like I'm 5

End of World War II in Europe

At the end of World War II in Europe, the Allied Powers (Britain, France, the Soviet Union, the United States, and other countries) were declared the winners. The war had been going on since 1939, when Germany invaded Poland. In Europe, the Allied Powers fought Germany and Italy, and eventually they were able to stop and defeat them. Germany surrendered in May 1945, which brought peace to Europe and marked the end of the war there, though the fighting against Japan continued until later that year.