Order of approximation is a way of measuring how accurate an approximate answer is. It lets us compare different approximations and see which one is better.
Imagine you have a really tall building and you want to know its height. You don't have a way to measure it exactly, so you estimate. You take a ruler and measure the length of the building's shadow, and then you measure the length of your own shadow. Because the sun casts both shadows at the same angle, the ratio of height to shadow length is the same for you and for the building. Using that ratio and your own height, you can work out the building's height. This result is your approximation.
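The shadow method above can be sketched in a few lines of code. The numbers here are made up for illustration:

```python
# Hypothetical measurements (all in metres)
my_height = 1.8
my_shadow = 1.2
building_shadow = 40.0

# The sun casts both shadows at the same angle, so
# height / shadow length is the same ratio for both objects.
estimated_height = building_shadow * (my_height / my_shadow)
print(estimated_height)  # 60.0
```

Changing any of the measurements changes the estimate, which is why it is only an approximation.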
The order of approximation is a way to measure how close your estimate is to the actual height. Orders are usually counted as zeroth order, first order, second order, and so on. Each higher order takes more detail into account, so a higher-order approximation is usually closer to the true value.
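One common place where orders of approximation appear is in approximating a function by the first few terms of its Taylor series. The sketch below (using sin(x) near 0, an illustrative choice, not something from the text above) shows the error getting smaller as the order goes up:

```python
import math

x = 0.5  # the point where we approximate sin(x)

# Taylor approximations of sin(x) around 0 at increasing orders
first_order = x                          # sin(x) ≈ x
third_order = x - x**3 / 6               # adds the next term
fifth_order = x - x**3 / 6 + x**5 / 120  # adds one more term

exact = math.sin(x)
for name, approx in [("1st", first_order), ("3rd", third_order), ("5th", fifth_order)]:
    print(name, "order error:", abs(approx - exact))
```

Each extra term shrinks the error, which matches the idea that a higher-order approximation is more accurate.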