ELI5: Explain Like I'm 5

West Florida

West Florida is a place in the southern part of the United States. About 200 years ago it was its own separate colony, before it was divided up and part of it became part of the state of Florida.

Picture a pizza. The whole pizza is the state of Florida, and West Florida is one of the slices: the one up in the top-left corner, next to Alabama.

People who lived in West Florida farmed, fished, and made things like clothes. They also had their own government, with leaders who made decisions about what was best for the people living there.

Eventually, West Florida stopped being its own separate place. Pieces of it went to nearby states like Louisiana, Mississippi, and Alabama, and the piece next to the rest of Florida became the western end of the state. Today it is just a regular region of Florida, where people still live, go to the beach, eat yummy food, and visit fun places like museums and amusement parks.