ELI5: Explain Like I'm 5

Shannon entropy

Shannon entropy is a way to measure the amount of uncertainty or surprise in a message or set of data. Imagine playing with a bag of different colored balls. If you know all the balls in the bag are red, you aren't surprised when you reach in and pull one out. However, if the bag has a mix of colors, you might be more surprised by what color you pull out.

Shannon entropy works the same way. The more uncertain or random a message or data set is, the higher its entropy. For example, flipping a fair coin has high entropy because heads and tails are equally likely, so you can't predict the outcome. But if the coin is weighted so that it lands on heads most of the time, the entropy is lower, because you already have a good idea of what will happen and there is less uncertainty.
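To put rough numbers on it: a fair coin works out to 1 bit of entropy, while a coin that lands on heads 90% of the time works out to only about 0.47 bits. The next paragraph explains where numbers like these come from.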

Mathematically, Shannon entropy is calculated by taking the probability of each possible outcome, multiplying it by the base-2 logarithm of that probability, adding those products together, and flipping the sign: H = -Σ p(x) · log2(p(x)). Don't worry too much about the math - the important thing to understand is that Shannon entropy is a way to measure how uncertain or "surprising" a message or data set is.
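If you'd like to see the formula in action, here is a minimal Python sketch (the function name shannon_entropy is just illustrative) that takes a list of outcome probabilities and computes that negative sum:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: sum of -p * log2(p) over outcomes with p > 0."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: heads and tails equally likely -> maximum surprise, 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A coin weighted to land heads 90% of the time -> less surprise, fewer bits.
print(shannon_entropy([0.9, 0.1]))   # ~0.469

# A bag where every ball is red -> no surprise at all, 0 bits.
print(shannon_entropy([1.0]))        # 0.0
```

Notice how the examples line up with the intuition above: the more evenly spread the probabilities, the higher the entropy, and a completely certain outcome has an entropy of zero.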