Okay, let me explain errors and residuals in a really simple way, like you are 5 years old!
Let's say you are playing a game of darts and your goal is to hit a certain spot on the board. If you throw the dart and it lands exactly where you wanted it to, then there is no error or residual to talk about. But if you throw the dart and it lands off the mark, then you have an error or a residual.
Errors are the difference between an observed value and the *true* value you were aiming for. In our dart example, the bullseye is the true target: if your dart lands 2 inches to the left of it, your error is 2 inches. Errors can be positive or negative depending on whether you overshot or undershot. The catch is that in real data the true value (like the true average height of every person in the world) is usually something we can never see, so the exact error is usually unknowable.
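Here is a minimal sketch in Python of that idea, assuming we happen to know the true target (the bullseye), which is exactly what real data usually doesn't give us. The numbers are made up for illustration.

```python
# Minimal sketch: an error is an observed value minus the TRUE value.
# Here the "true value" is the bullseye, which we happen to know.
true_target = 0.0                 # the bullseye position (inches)
throws = [-2.0, 1.5, 0.5]         # where each dart landed (negative = left)

errors = [t - true_target for t in throws]
print(errors)  # [-2.0, 1.5, 0.5] -> the first dart missed by 2 inches to the left
```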
Now, residuals are very similar to errors in that they also measure a difference between an actual value and some reference value. The key distinction is that a residual compares each data point to the value your *estimated* model predicts for it, such as the line of best fit, and that is something you can always compute from the data you have.
Let's imagine that you have a scatterplot of data points and you are trying to draw a straight line through them to best represent the relationship between the two variables. It is unlikely that this line will pass exactly through every single point. The gap between an actual point and the line's prediction at the same spot is called the residual for that point.
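Here is a minimal sketch of that, using NumPy's `polyfit` to fit the line of best fit; the data points are made up for illustration.

```python
import numpy as np

# Made-up data points for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit a straight line y = slope * x + intercept (the "line of best fit")
slope, intercept = np.polyfit(x, y, 1)

# The line's prediction at each x, and the residual at each point
predicted = slope * x + intercept
residuals = y - predicted          # actual value minus the line's prediction

print(residuals)        # one residual per point, some positive, some negative
print(residuals.sum())  # for a least-squares line these sum to (nearly) zero
```

Each residual tells you how far that one point sits above or below the fitted line, which is why looking at the residuals is the standard way to judge how well the line fits.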
So, in short, errors and residuals both measure how far off an actual value is from a reference value. The difference is that errors compare against the true value, which you usually cannot observe, while residuals compare against your model's estimate (like the line of best fit), which you can always compute. That is why, in practice, statisticians look at residuals to measure how well a model fits the data.