The method of moments is a way of finding a good estimate of an unknown value, such as a probability. It works by comparing the number of times something is expected to happen with the number of times it actually happened, and adjusting the estimate until the two match. For example, let's say you just rolled a die six times, and five of those rolls came up six. We could use the method of moments to figure out what the probability of getting a six is.
The method works like this: we count up how many times each possible number came up (five sixes in this case). Then we work out how many times each number should have come up if the die were fair, using the formula "expected count = total number of tries × probability of the outcome". So if the die were fair, each of the six numbers should have come up equally often.
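To make the counting step concrete, here is a minimal Python sketch. The `rolls` list is hypothetical data made up to match the example (six rolls, five of them sixes):

```python
from collections import Counter

# Hypothetical rolls matching the example: six tries, five sixes.
rolls = [6, 6, 6, 2, 6, 6]
n = len(rolls)

observed = Counter(rolls)        # how many times each face actually came up
expected_if_fair = n * (1 / 6)   # expected count per face: tries x probability

for face in range(1, 7):
    print(f"face {face}: observed {observed.get(face, 0)}, expected {expected_if_fair:.2f}")
```

Running it shows that each face was expected to come up once, while face 6 actually came up five times.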
But we got five sixes, while the expected number of sixes for a fair die was one-sixth of the total tries (1/6 × 6 = 1). The actual count (5) doesn't match the expected count (1), which tells us the probability of getting a six isn't one-sixth.
Using the method of moments, we set the expected count equal to the actual count and solve for the probability: 6 × p = 5, so the estimated probability of getting a six is p = 5/6.
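Here is a minimal sketch of that last step, assuming the same hypothetical `rolls` data as above: set the expected count equal to the observed count and solve for the probability.

```python
# Hypothetical rolls matching the example: six tries, five sixes.
rolls = [6, 6, 6, 2, 6, 6]
n = len(rolls)
observed_sixes = sum(1 for r in rolls if r == 6)   # 5

# Method of moments: expected count = actual count, i.e. n * p = observed_sixes,
# so the estimate is p = observed_sixes / n.
p_hat = observed_sixes / n
print(p_hat)   # 0.8333..., i.e. 5/6
```

The same one-line division works for any face and any number of rolls: the estimate is simply how often the outcome was observed divided by how many tries there were.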