Entropy is a measure of randomness or disorder, and entropy units express how much of that disorder a system has. Quantitatively, entropy tracks how spread out a system's thermal energy is at a given temperature, so its units are energy divided by temperature: joules per kelvin (J/K) in SI, or the older "entropy unit" (e.u., one calorie per kelvin per mole) used in chemistry. For an intuition, picture a pile of books scattered on the floor: the disorder (or entropy) of the room is high. To make the room more organized and lower its entropy, you have to put in work, like picking up each book and placing it in the right spot on the shelf. In the same way, decreasing a physical system's entropy requires an input of energy from outside; the entropy itself, measured in joules per kelvin, tells you how dispersed the system's energy already is.
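As a quick sketch of where those units come from, assume a small amount of heat is added reversibly to a system held at a constant absolute temperature; the entropy change is the heat divided by the temperature, which is why the result comes out in joules per kelvin:

\[
\Delta S = \frac{q_{\mathrm{rev}}}{T}, \qquad \text{for example } \Delta S = \frac{300\ \mathrm{J}}{300\ \mathrm{K}} = 1\ \mathrm{J/K}.
\]

The numbers here are illustrative only; the point is that the unit is energy per temperature, not energy on its own.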