Dominionism is a strand of Christian political thought holding that Christians should gain control of society and govern it according to Biblical principles, including laws concerning religion, sexuality, and even financial matters. Critics regard Dominionism as dangerous, arguing that it could lead to discrimination against those who do not follow its teachings. Dominionists, however, maintain that it is their duty to spread the moral teachings of the Bible in order to make the world a better place.