Cross entropy is a way to measure how different one probability distribution is from another. It is most often used to compare a predicted distribution q against a true distribution p — for example, a classifier's predicted class probabilities against the true label — and in image tasks it can compare pixel values when they are treated as probabilities. For discrete distributions it is defined as H(p, q) = −Σ p(x) log q(x). If q is very different from p, the cross entropy will be high; if q is very similar to p, the cross entropy will be low. Note that it is not a symmetric distance: H(p, q) generally differs from H(q, p), and its minimum over q, reached when q equals p, is the entropy of p, which is usually not zero.
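As a minimal sketch of the definition above (the distributions here are made-up examples), the following computes H(p, q) = −Σ p(x) log q(x) for a fixed "true" distribution p against a similar and a dissimilar prediction, showing that the dissimilar one scores higher:

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum over x of p(x) * log q(x), using natural log.

    Terms with p(x) == 0 contribute nothing, so they are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]        # "true" distribution (hypothetical example)
close = [0.6, 0.3, 0.1]    # similar to p  -> lower cross entropy
far = [0.1, 0.2, 0.7]      # different from p -> higher cross entropy

print(cross_entropy(p, close))  # about 0.83
print(cross_entropy(p, far))    # about 1.97
print(cross_entropy(p, p))      # the entropy of p itself, about 0.80 (not zero)
```

The last line illustrates the point above: even when the two distributions match exactly, the cross entropy equals the entropy of p rather than dropping to zero.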